SYNTHESIS OF BIOMEDICAL IMAGES BASED ON GENERATIVE ADVERSARIAL NETWORKS

Ukrainian Journal of Information Technology, 2019, Vol. 1, No. 1, pp. 35–40
https://doi.org/10.23939/ujit2019.01.035
Received: October 15, 2019
Accepted: November 20, 2019

Citation (DSTU): Berezsky O. M., Liashchynskyi P. M., Liashchynskyi P. M., Sukhovych A. R., Dolynyuk T. M. Synthesis of biomedical images based on generative adversarial networks. Ukrainian Journal of Information Technology. 2019, Vol. 1, No. 1, pp. 35–40.

Citation APA: Berezsky, O. M., Liashchynskyi, P. B., Liashchinskyi, P. B., Sukhovych, A. R., & Dolynyuk, T. M. (2019). Synthesis of biomedical images based on generative adversarial networks. Ukrainian Journal of Information Technology, 1(1), 35–40. https://doi.org/10.23939/ujit2019.01.035

1 Ternopil National University, Ternopil, Ukraine; Lviv Polytechnic National University, Lviv, Ukraine
2, 3, 4, 5 Ternopil National University, Ternopil, Ukraine

Modern databases of biomedical images have been investigated. Biomedical imaging has been shown to be expensive and time-consuming. A database of images of precancerous and cancerous breasts, "BPCI2100", was developed. The database consists of 2,100 image files and a MySQL database of medical research information (patient information and image features). Generative adversarial networks (GAN) have been found to be an effective means of image generation. The architecture of a generative adversarial network consisting of a generator and a discriminator has been developed. The discriminator is a deep convolutional neural network that takes 128×128-pixel color images as input. This network consists of six convolutional layers with a window size of 5×5 pixels. The Leaky ReLU activation function is used for the convolutional layers, and the last layer uses a sigmoid activation function. The generator is a neural network consisting of a fully connected layer and seven deconvolution layers with a 5×5-pixel window size. The Leaky ReLU activation function is used for all layers, except that the last layer uses the hyperbolic tangent activation function. Google Cloud Compute Instance tools have been used to train the generative adversarial network. Generation of histological and cytological images on the basis of the generative adversarial network was conducted. As a result, the training sample for classifiers has been significantly increased.
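To make this architecture concrete, the sketch below expresses it in PyTorch. Only the layer counts, the 5×5 window size, the activations, and the 128×128 color input come from the description above; the stride-2 sampling, the channel widths, and the 100-dimensional latent vector are illustrative assumptions not specified in the paper.

# A minimal PyTorch sketch of the described GAN. The 5x5 windows, Leaky ReLU
# activations, sigmoid/tanh outputs, layer counts, and 128x128 RGB input
# follow the text; stride-2 sampling, channel widths, and the latent size
# are assumptions made for illustration.
import torch
import torch.nn as nn

LATENT_DIM = 100  # assumed; the paper does not state the latent size

class Discriminator(nn.Module):
    """Six 5x5 convolutional layers over a 128x128 color image, sigmoid output."""
    def __init__(self):
        super().__init__()
        widths = [3, 64, 128, 256, 512, 512, 512]  # assumed channel widths
        layers = []
        for c_in, c_out in zip(widths[:-1], widths[1:]):
            layers += [nn.Conv2d(c_in, c_out, kernel_size=5, stride=2, padding=2),
                       nn.LeakyReLU(0.2)]
        self.features = nn.Sequential(*layers)  # 128x128 -> 2x2 spatially
        self.head = nn.Sequential(nn.Flatten(),
                                  nn.Linear(512 * 2 * 2, 1),
                                  nn.Sigmoid())

    def forward(self, x):
        return self.head(self.features(x))

class Generator(nn.Module):
    """A fully connected layer followed by seven 5x5 deconvolutions, tanh output."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(LATENT_DIM, 1024)  # reshaped to a 1024x1x1 seed
        widths = [1024, 512, 512, 256, 256, 128, 64]
        layers = []
        for c_in, c_out in zip(widths[:-1], widths[1:]):  # six doubling steps
            layers += [nn.ConvTranspose2d(c_in, c_out, kernel_size=5, stride=2,
                                          padding=2, output_padding=1),
                       nn.LeakyReLU(0.2)]
        layers += [nn.ConvTranspose2d(64, 3, kernel_size=5, stride=2,
                                      padding=2, output_padding=1),
                   nn.Tanh()]  # seventh deconvolution: 1x1 seed -> 128x128 image
        self.deconv = nn.Sequential(*layers)

    def forward(self, z):
        return self.deconv(self.fc(z).view(-1, 1024, 1, 1))

# Shape check: latent vectors map to 3x128x128 images, which the
# discriminator scores in [0, 1].
z = torch.randn(4, LATENT_DIM)
fake = Generator()(z)          # torch.Size([4, 3, 128, 128])
score = Discriminator()(fake)  # torch.Size([4, 1])

Under the stride-2 assumption, six convolutions reduce a 128×128 input to a small feature map, while seven deconvolutions grow a 1×1 seed back to 128×128, which is consistent with the differing layer counts of the two networks.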

Original histological images are divided into 5 classes, cytological images into 4 classes. The original sample size is 91 images for histological images and 78 images for cytological images. The training samples were expanded to 1000 images by affine transformations (shift, zoom, rotation, reflection; a sketch of this step follows below). Training the classifier on the original sample yielded an accuracy of ≈84 % for histological images and ≈75 % for cytological images, respectively. On the sample of generated images, the classification accuracy was ≈96.5 % for histological images and ≈95.5 % for cytological images, i.e., a gain of ≈12 % for histological images and ≈20.5 % for cytological images. Computer experiments showed that training the generative adversarial network took ≈9 hours for histological images and ≈8.5 hours for cytological images. A prospect for further research is the parallelization of training algorithms for generative adversarial networks.
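As an illustration, the affine augmentation step can be reproduced with standard tooling. The sketch below uses torchvision; the parameter ranges are assumptions, since the text names only the transform types.

# A hedged sketch of the affine augmentation step. torchvision's RandomAffine
# and RandomHorizontalFlip cover the listed transforms (shift, zoom, rotation,
# reflection); the parameter ranges below are assumptions, not values from
# the paper.
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomAffine(degrees=30,            # rotation
                            translate=(0.1, 0.1),  # shift
                            scale=(0.9, 1.1)),     # zoom
    transforms.RandomHorizontalFlip()])            # reflection

# Applying `augment` repeatedly to each original image (91 histological,
# 78 cytological) yields as many distinct variants as needed to reach the
# 1000-image training samples.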
