Please use this identifier to cite or link to this item: http://hdl.handle.net/20.500.12188/17170
DC Field | Value | Language
dc.contributor.author | Popovska-Mitrovikj, Aleksandra | en_US
dc.contributor.author | Bakeva, Verica | en_US
dc.contributor.author | Mechkaroska, Daniela | en_US
dc.date.accessioned | 2022-03-29T12:32:25Z | -
dc.date.available | 2022-03-29T12:32:25Z | -
dc.date.issued | 2020 | -
dc.identifier.uri | http://hdl.handle.net/20.500.12188/17170 | -
dc.description.abstract | Random Codes Based on Quasigroups (RCBQ) are cryptcodes that provide correction of transmission errors and information security with a single algorithm. The Standard, Cut-Decoding and 4-Sets-Cut-Decoding algorithms are different versions of RCBQ proposed elsewhere. Decoding in all these algorithms is list decoding, so the decoding speed depends on the list size. To decrease the list size, the Fast-Cut-Decoding and Fast-4-Sets-Cut-Decoding algorithms were proposed elsewhere; they improve the performance of these codes for transmission through a Gaussian channel. Here, we propose a new modification of these algorithms to improve their properties for transmission through burst channels. | en_US
dc.language.iso | en | en_US
dc.publisher | Springer International Publishing | en_US
dc.subject | Cryptcoding, Burst errors, Gilbert-Elliott channel, SNR, Quasigroup | en_US
dc.title | Fast Decoding with Cryptcodes for Burst Errors | en_US
dc.type | Proceeding article | en_US
dc.relation.conference | Machine Learning and Applications. ICT Innovations 2020. | en_US
dc.identifier.doi | 10.1007/978-3-030-62098-1_14 | -
dc.identifier.url | https://link.springer.com/content/pdf/10.1007/978-3-030-62098-1_14 | -
item.grantfulltext | none | -
item.fulltext | No Fulltext | -
crisitem.author.dept | Faculty of Computer Science and Engineering | -
Appears in Collections:Faculty of Computer Science and Engineering: Conference papers
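
The record's keywords name the Gilbert-Elliott channel, the standard two-state Markov model for the burst errors discussed in the abstract. The following is a minimal, illustrative Python sketch of such a channel, not code from the paper; the function name and all parameter values are hypothetical examples chosen only to show how errors cluster while the channel sits in its "bad" state.

import random

def gilbert_elliott(bits, p_gb=0.01, p_bg=0.2, err_good=0.001, err_bad=0.3, seed=None):
    """Flip bits according to a two-state (Good/Bad) Markov burst channel.

    p_gb: probability of moving Good -> Bad per transmitted bit
    p_bg: probability of moving Bad -> Good per transmitted bit
    err_good / err_bad: bit-error probability in each state (hypothetical values)
    """
    rng = random.Random(seed)
    state_bad = False
    out = []
    for b in bits:
        # State transition of the two-state Markov chain
        if state_bad:
            if rng.random() < p_bg:
                state_bad = False
        else:
            if rng.random() < p_gb:
                state_bad = True
        # Error injection: errors cluster while the channel stays in the Bad state
        p_err = err_bad if state_bad else err_good
        out.append(b ^ 1 if rng.random() < p_err else b)
    return out

if __name__ == "__main__":
    msg = [0] * 200
    received = gilbert_elliott(msg, seed=42)
    print("burst errors at positions:", [i for i, b in enumerate(received) if b])

Running the example on an all-zero message shows the error positions grouped into bursts rather than spread uniformly, which is the channel behaviour the paper's modified decoding algorithms target.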