Adaptation of applications to compare development frameworks in deep learning for decentralized Android applications

  1. Beatriz Sainz-de-Abajo 1
  2. Sergio Laso
  3. Jose Garcia-Alonso 2
  4. Javier Berrocal 2
  1. Universidad de Valladolid, Valladolid, Spain. ROR: https://ror.org/01fvbaw18

  2. Universidad de Extremadura, Badajoz, Spain. ROR: https://ror.org/0174shg90

Journal:
IJIMAI

ISSN: 1989-1660

Year of publication: 2023

Volume: 8

Issue: 2

Pages: 224-231

Type: Article

DOI: 10.9781/IJIMAI.2023.04.006

Abstract

Not all machine learning and deep learning frameworks can be integrated into Android, and those that can require certain prerequisites. The primary objective of this paper is to present the results of an analysis and comparison of deep learning development frameworks that can be adapted into Android apps fully decentralized from a cloud server. As a working methodology, we develop and/or modify the test applications that these frameworks provide so as to allow an equitable comparison of the characteristics of interest. These parameters are related to attributes that a user would consider: (1) percentage of correct answers; (2) battery consumption; and (3) power consumption of the processor. After analysing the numerical results, the framework that behaves best with respect to the analysed characteristics for the development of an Android application is TensorFlow, which obtained better scores than Caffe2 and Snapdragon NPE in percentage of correct answers, battery consumption, and device CPU power consumption. Data consumption was not considered because this study focuses on applications that are decentralized from cloud storage.
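
For context, the kind of fully on-device (decentralized) inference the paper compares can be sketched with TensorFlow Lite, the framework reported as best performing. The sketch below is illustrative only: the class name, the asset name "model.tflite", the 224x224x3 float input, the pixel normalisation, and the number of classes are assumptions, not details taken from the article's test applications.

  // Minimal Kotlin sketch of on-device inference with TensorFlow Lite on Android.
  // The model is bundled with the APK and executed locally, so no cloud server
  // is contacted at inference time (hence no mobile data consumption).
  import android.content.Context
  import android.graphics.Bitmap
  import org.tensorflow.lite.Interpreter
  import java.io.FileInputStream
  import java.nio.ByteBuffer
  import java.nio.ByteOrder
  import java.nio.MappedByteBuffer
  import java.nio.channels.FileChannel

  class OnDeviceClassifier(context: Context, private val numClasses: Int) {

      // Memory-map the bundled model from assets ("model.tflite" is a placeholder name).
      private val interpreter = Interpreter(loadModel(context, "model.tflite"))

      private fun loadModel(context: Context, assetName: String): MappedByteBuffer {
          val afd = context.assets.openFd(assetName)
          FileInputStream(afd.fileDescriptor).channel.use { channel ->
              return channel.map(FileChannel.MapMode.READ_ONLY, afd.startOffset, afd.declaredLength)
          }
      }

      /** Runs a single forward pass and returns the index of the most likely class. */
      fun classify(bitmap: Bitmap): Int {
          // Assumed input: one 224x224 RGB image of 32-bit floats.
          val input = ByteBuffer.allocateDirect(4 * 224 * 224 * 3).order(ByteOrder.nativeOrder())
          val scaled = Bitmap.createScaledBitmap(bitmap, 224, 224, true)
          val pixels = IntArray(224 * 224)
          scaled.getPixels(pixels, 0, 224, 0, 0, 224, 224)
          for (p in pixels) {
              // Normalise RGB channels to [0, 1]; the exact preprocessing depends on the model.
              input.putFloat(((p shr 16) and 0xFF) / 255f)
              input.putFloat(((p shr 8) and 0xFF) / 255f)
              input.putFloat((p and 0xFF) / 255f)
          }
          input.rewind()

          // Assumed output: one probability per class.
          val output = Array(1) { FloatArray(numClasses) }
          interpreter.run(input, output)
          return output[0].indices.maxByOrNull { output[0][it] } ?: -1
      }
  }

Because the model is memory-mapped from the APK and every inference runs on the device CPU, the measurable costs are exactly the ones the study analyses: accuracy, battery drain, and processor power, with no network traffic involved.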
