Novel Multi-Center and Threshold Ternary Pattern Based Method for Disease Detection Using Voice

Smart health is one of the most popular and important components of smart cities. It is a relatively new context-aware healthcare paradigm shaped by several fields of expertise, such as medical informatics, communications and electronics, bioengineering, and ethics, to name a few. Smart health improves healthcare by providing services such as patient monitoring and early diagnosis of disease. The artificial neural network (ANN), the support vector machine (SVM) and deep learning models, especially the convolutional neural network (CNN), are the most commonly used machine learning approaches, and they have proved to perform well in most cases. Voice disorders are spreading rapidly, yet they are often underestimated; smart health systems can offer easy and fast support for voice pathology detection. To obtain a smart and precise mobile health system, an algorithm that discriminates between pathological and healthy voices with high accuracy is needed. The main contribution of this paper is a multiclass pathological voice classification that uses a novel multileveled textural feature extraction with an iterative feature selector. Our approach is a simple and efficient voice-based algorithm built on a multi-center and multi-threshold-based ternary pattern (MCMTTP). More compact multileveled features are then obtained by sample-based discretization techniques, and Neighborhood Component Analysis (NCA) is applied to select features iteratively. These features are finally integrated with MCMTTP to achieve accurate voice-based disease detection. Experimental results of six classifiers on three diagnosed diseases (frontal resection, cordectomy and spastic dysphonia) show that the fused features are well suited to voice-based disease detection.
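The abstract does not give the exact MCMTTP formulation, so the following is a minimal sketch of the general idea only, assuming a 1-D voice signal, a sliding window, two hypothetical center definitions (the window's middle sample and its mean) and a small set of hypothetical thresholds; the function name and all parameter values are illustrative, not the authors':

```python
import numpy as np

def ternary_pattern_features(signal, width=9, thresholds=(0.01, 0.05),
                             centers=("mid", "mean")):
    """Sketch of a multi-center, multi-threshold ternary pattern extractor
    (MCMTTP-like) for a 1-D signal.

    For every window of `width` samples, each neighbor is coded against a
    center value as +1 (above center + t), -1 (below center - t) or 0.
    The ternary code is split into an "upper" and a "lower" binary pattern,
    and per-pattern histograms are concatenated over all (center, threshold)
    pairs to form the feature vector.
    """
    signal = np.asarray(signal, dtype=float)
    n_neigh = width - 1                      # neighbors per window
    feats = []
    for c_kind in centers:
        for t in thresholds:
            upper_hist = np.zeros(2 ** n_neigh)
            lower_hist = np.zeros(2 ** n_neigh)
            for start in range(len(signal) - width + 1):
                w = signal[start:start + width]
                mid = width // 2
                center = w[mid] if c_kind == "mid" else w.mean()
                neigh = np.delete(w, mid)    # drop the center sample
                # ternary code: +1 / 0 / -1 per neighbor
                code = np.where(neigh > center + t, 1,
                                np.where(neigh < center - t, -1, 0))
                weights = 2 ** np.arange(n_neigh)
                # upper pattern: bits where code == +1; lower: code == -1
                upper_hist[int(((code == 1) * weights).sum())] += 1
                lower_hist[int(((code == -1) * weights).sum())] += 1
            feats.extend([upper_hist, lower_hist])
    return np.concatenate(feats)
```

Each (center, threshold) pair contributes two 256-bin histograms (upper and lower binary patterns from the ternary code), so the defaults yield a 2 × 2 × 2 × 256 = 2048-dimensional vector; a selector such as NCA would then rank and prune these features before classification.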

Published in the IEEE Electronics Packaging Society Section within IEEE Access.



AI and IoT Convergence for Smart Health

Submission Deadline: 31 May 2021

IEEE Access invites manuscript submissions in the area of AI and IoT Convergence for Smart Health.   

With the development of smart sensorial media, things, and cloud technologies, “smart healthcare” is receiving remarkable attention from academia, government, industry, and healthcare communities. Recently, the Internet of Things (IoT) has brought the vision of a smarter world into reality, with a massive amount of data and numerous services. With the outbreak of COVID-19, Artificial Intelligence (AI) has gained significant attention by applying its machine learning algorithms to quality patient care. The convergence of IoT and AI can provide new opportunities for both technologies: AI-driven IoT can play a significant role in smart healthcare by offering better insight into healthcare data to support affordable personalized care; it can also supply powerful processing and storage facilities for huge IoT data streams (big data) beyond the capability of individual “things,” and provide automated decision-making in real time. While researchers have been advancing the study of AI and IoT for health services individually, very little attention has been given to developing cost-effective and affordable smart healthcare services. AI-driven IoT (AIIoT) for smart healthcare has the potential to revolutionize many aspects of the healthcare industry; however, many technical challenges need to be addressed before this potential can be realized.

This Special Section is intended to report high-quality research on recent advances toward AI and IoT convergence for smart healthcare, specifically state-of-the-art approaches, methodologies, and systems for the design, development, deployment, and innovative use of these convergence technologies to provide insight into smart healthcare service demands. Authors are solicited to submit complete articles, not previously published elsewhere, on the topics below.

The topics of interest include, but are not limited to:

  • AI-empowered innovative classification techniques and testbeds for healthcare in IoT-cloud platform
  • AI-empowered big data analytics and cognitive computing for smart health monitoring
  • Advanced AIIoT convergent services, systems, infrastructure and techniques for healthcare
  • AI-supported IoT data analytics for smart healthcare
  • Machine learning-based smart homecare for mobile-enabled fall detection of disabled or elderly people
  • AIIoT-empowered data analysis for COVID-19
  • AI-enabled contact tracing for preventing the spread of COVID-19
  • AI and IoT convergence for pandemic management and monitoring
  • Intelligent IoT-driven diagnosis and prognosis mechanisms for infectious diseases
  • IoT cloud-based predictive analysis for personalized healthcare
  • AI-supported healthcare in IoT-cloud platforms
  • AIIoT-supported approaches and testbeds for social distance monitoring in pandemic prevention
  • Security, privacy, and trust of AI-IoT convergent smart healthcare system

We also highly recommend the submission of multimedia with each article as it significantly increases the visibility and downloads of articles.


Associate Editor:  M. Shamim Hossain, King Saud University, Saudi Arabia

Guest Editors:

    1. Stefan Goebel, Technical University Darmstadt, Germany
    2. Abdulsalam Yassine, Lakehead University, Canada
    3. Diana P. Tobón, Universidad de Medellín, Colombia
    4. Fakhri Karray, University of Waterloo, Canada


Relevant IEEE Access Special Sections:

    1. Deep Learning Algorithms for Internet of Medical Things
    2. Behavioral Biometrics for eHealth and Well-Being
    3. Emerging Deep Learning Theories and Methods for Biomedical Engineering


IEEE Access Editor-in-Chief:  Prof. Derek Abbott, University of Adelaide

Article submission: Contact Associate Editor and submit manuscript to:

 For inquiries regarding this Special Section, please contact: