DOI: 10.37421/0974-7230.2022.15.432
DOI: 10.37421/0974-7230.2022.15.433
Digital signatures are an increasingly widely used technology. Their primary roles are to detect and prevent unauthorised changes to data and to verify the identity of the signer. Applications include signing legally binding contracts, protecting software updates, and securing online business transactions through digital certificates, among many other settings. Establishing keys over insecure, public channels is a critical part of this picture, and digital signatures are essential for securing monetary transactions that take place over open or insecure networks. Digital signature techniques are also commonly used in white-box cryptographic protocols, enabling services such as entity authentication, authenticated key transfer, and key agreement.
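As a minimal illustration of the sign/verify mechanism described above, the sketch below uses the third-party Python "cryptography" package with Ed25519 (an assumption for illustration; the abstract names no particular library or algorithm):

```python
# Minimal digital-signature sketch using the third-party "cryptography"
# package (pip install cryptography). Illustrative only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # signer's secret key
public_key = private_key.public_key()        # can be shared over a public channel

message = b"wire $100 to account 42"
signature = private_key.sign(message)        # commits to the exact message bytes

try:
    public_key.verify(signature, message)            # authentic: no exception
    public_key.verify(signature, b"wire $900 ...")   # tampered: raises
except InvalidSignature:
    print("tampering or forgery detected")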
DOI: 10.37421/0974-7230.2022.15.434
DOI: 10.37421/0974-7230.2022.15.435
DOI: 10.37421/0974-7230.2022.15.436
DOI: 10.37421/0974-7230.2024.17.531
DOI: 10.37421/0974-7230.2024.17.532
DOI: 10.37421/0974-7230.2024.17.533
DOI: 10.37421/0974-7230.2024.17.534
DOI: 10.37421/0974-7230.2024.17.536
DOI: 10.37421/0974-7230.2024.17.537
DOI: 10.37421/0974-7230.2024.17.538
DOI: 10.37421/0974-7230.2024.17.539
DOI: 10.37421/0974-7230.2024.17.540
DOI: 10.37421/0974-7230.2024.17.535
DOI: 10.37421/0974-7230.2024.17.512
DOI: 10.37421/0974-7230.2024.17.513
With the rapid growth of cloud computing, energy consumption in data centers has become a significant concern due to its environmental impact and operational costs. Green cloud computing aims to minimize energy consumption and carbon emissions by employing energy-efficient technologies and practices. Scheduling algorithms play a crucial role in optimizing resource utilization and reducing energy consumption in cloud environments. This research article explores various energy-efficient scheduling algorithms for green cloud computing, including task scheduling, virtual machine allocation, and workload consolidation techniques. We discuss the underlying principles, challenges, and opportunities of these algorithms, along with practical implementations and case studies demonstrating their effectiveness in improving energy efficiency and sustainability in cloud data centers.
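One of the consolidation techniques this family of algorithms relies on can be sketched as simple bin packing: place VM demands onto as few hosts as possible so idle hosts can be powered down. The capacities and demands below are hypothetical:

```python
# First-fit-decreasing consolidation sketch. Real schedulers also weigh
# RAM, I/O, migration cost, and SLA constraints.
def consolidate(vm_demands, host_capacity):
    hosts = []       # remaining capacity on each active host
    placement = {}
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for i, free in enumerate(hosts):
            if demand <= free:
                hosts[i] -= demand
                placement[vm] = i
                break
        else:        # no active host fits: power on a new one
            hosts.append(host_capacity - demand)
            placement[vm] = len(hosts) - 1
    return placement, len(hosts)

placement, active = consolidate({"vm1": 0.6, "vm2": 0.3, "vm3": 0.5, "vm4": 0.2}, 1.0)
print(placement, "active hosts:", active)   # packs four VMs onto two hosts
```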
DOI: 10.37421/0974-7230.2024.17.514
The advent of latency-sensitive applications, coupled with the proliferation of Internet of Things devices, has heightened the demand for real-time data processing and low-latency communication. Edge computing and cloud computing have emerged as two complementary paradigms to address these requirements. Edge devices bring computational resources closer to data sources, reducing latency and bandwidth usage, while cloud infrastructure offers scalability and flexibility. This paper explores the collaboration between edge and cloud computing to enhance the performance of latency-sensitive applications. We discuss the challenges, opportunities, and recent advancements in edge-cloud collaboration, along with case studies and future directions.
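The core placement trade-off can be sketched as a latency estimate per site: network round-trip plus compute time, with the task running wherever the sum is lower. All figures below are hypothetical illustrations, not values from the paper:

```python
# Toy edge-vs-cloud placement decision.
def total_latency_ms(rtt_ms, task_ops, ops_per_ms):
    return rtt_ms + task_ops / ops_per_ms

def choose_site(task_ops):
    edge = total_latency_ms(rtt_ms=5, task_ops=task_ops, ops_per_ms=1e6)    # near, slower
    cloud = total_latency_ms(rtt_ms=60, task_ops=task_ops, ops_per_ms=2e7)  # far, faster
    return ("edge", edge) if edge <= cloud else ("cloud", cloud)

print(choose_site(1e7))   # small task: the edge wins on round-trip time
print(choose_site(5e9))   # heavy task: the cloud's compute power dominates
```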
DOI: 10.37421/0974-7230.2024.17.515
Multi-cloud strategies, involving the use of multiple cloud service providers, have emerged as a critical approach to enhancing resilience and performance in cloud services. This paper explores the advantages, challenges, and best practices associated with multi-cloud deployments. By examining case studies and current research, we provide a comprehensive overview of how multi-cloud strategies can mitigate risks associated with vendor lock-in, improve disaster recovery capabilities, and optimize performance through load distribution.
DOI: 10.37421/0974-7230.2024.17.516
DOI: 10.37421/0974-7230.2024.17.517
DOI: 10.37421/0974-7230.2024.17.518
DOI: 10.37421/0974-7230.2024.17.520
Quantum computing represents a transformative leap in computational capabilities, poised to solve problems deemed intractable for classical computers. Its integration with cloud services promises to revolutionize various industries by enhancing computational power, optimizing complex algorithms, and providing advanced security solutions. This article evaluates the potential impact of quantum computing on cloud services, examining the technological advancements, industry applications, security implications, and the challenges that lie ahead.
DOI: 10.37421/0974-7230.2024.17.521
DOI: 10.37421/0974-7230.2023.16.482
The explosive growth of data in the digital age has enabled organizations to derive valuable insights and knowledge through data mining. However, the extraction of knowledge from large datasets raises significant privacy concerns. This research article explores privacy-preserving data mining techniques, which balance the need for knowledge discovery with the imperative of safeguarding personal data. We discuss various methods to protect privacy during data mining, emphasizing their importance for secure and ethical knowledge discovery.
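One widely used technique in this space is differential privacy's Laplace mechanism: answer an aggregate query with calibrated noise so any single person's presence changes the answer distribution only slightly. A minimal sketch (the data are arbitrary examples):

```python
# Laplace-mechanism sketch for a privacy-preserving count query.
import numpy as np

def dp_count(records, predicate, epsilon):
    true_count = sum(1 for r in records if predicate(r))
    sensitivity = 1.0  # adding/removing one record shifts a count by at most 1
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

ages = [23, 37, 41, 52, 29, 64, 58]
print(dp_count(ages, lambda a: a > 40, epsilon=0.5))  # noisy answer near 4
```

Smaller epsilon means stronger privacy but noisier answers, which is exactly the knowledge-discovery/privacy balance the article discusses.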
DOI: 10.37421/0974-7230.2023.16.483
DOI: 10.37421/0974-7230.2023.16.484
DOI: 10.37421/0974-7230.2023.16.485
DOI: 10.37421/0974-7230.2023.16.487
DOI: 10.37421/0974-7230.2023.16.488
The Industrial Internet of Things (IIoT) has revolutionized the way industries operate by providing real-time data from various sensors and devices. However, the vast amount of data generated in IIoT networks poses a significant challenge in identifying anomalies and potential security threats. In this research article, we explore the use of data mining and machine learning techniques for efficient anomaly detection in IIoT networks. We present a comprehensive analysis of various methodologies and tools that can be employed to enhance the security and reliability of industrial systems. Our findings suggest that a combination of feature engineering, supervised learning, and unsupervised learning techniques can lead to highly effective and efficient anomaly detection systems.
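As a concrete example of the unsupervised family the article surveys, the sketch below fits scikit-learn's IsolationForest to synthetic stand-ins for IIoT telemetry (temperature and vibration readings are invented for illustration):

```python
# Unsupervised anomaly detection with an isolation forest.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50.0, 1.2], scale=[2.0, 0.1], size=(500, 2))  # temp, vibration
faulty = np.array([[75.0, 3.5], [20.0, 0.2]])                          # out-of-range events

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(model.predict(faulty))      # -1 flags anomalies
print(model.predict(normal[:3]))  # mostly +1 (normal)
```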
DOI: 10.37421/0974-7230.2023.16.489
Cloud computing has become a ubiquitous technology in the digital age, offering scalability, flexibility, and cost-efficiency to a wide range of applications and services. To harness the full potential of cloud resources, dynamic resource allocation is essential. This research article explores the application of Fuzzy Network Control for dynamic resource allocation in cloud networks. Fuzzy logic, with its ability to handle imprecise and uncertain information, offers a promising approach to optimizing resource allocation in dynamic cloud environments. The article presents an in-depth analysis of the concept, its methodology, benefits, challenges, and potential future developments.
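A minimal sketch of the fuzzy-control idea, with hypothetical membership breakpoints and rules (the article does not specify its controller design): fuzzify CPU load into low/medium/high, apply three rules, and defuzzify to a scaling action.

```python
# Triangular-membership fuzzy scaler: -1 = remove a VM, 0 = hold, +1 = add a VM.
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def scale_decision(cpu_load):  # cpu_load in [0, 1]
    low = tri(cpu_load, -0.1, 0.0, 0.5)
    med = tri(cpu_load, 0.2, 0.5, 0.8)
    high = tri(cpu_load, 0.5, 1.0, 1.1)
    # Rules: low load -> scale down, medium -> hold, high -> scale up.
    return (low * -1 + med * 0 + high * 1) / (low + med + high)

for load in (0.1, 0.5, 0.9):
    print(load, round(scale_decision(load), 2))
```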
DOI: 10.37421/0974-7230.2023.16.490
In the era of big data, the challenge of extracting meaningful insights and knowledge from various domains has never been more significant. Cross-domain data mining (CDDM) has emerged as a powerful approach to leverage knowledge from one domain and apply it to another. This research article explores the techniques and applications of CDDM, focusing on knowledge transfer and generalization across different domains. We delve into the methodologies and tools that enable the seamless flow of information between domains, fostering innovation, efficiency, and improved decision-making.
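One simple form of the knowledge transfer described above is fine-tuning: pre-train a classifier on a data-rich source domain, then continue training on a small target-domain sample rather than starting from scratch. A sketch on synthetic data (the domains and shift are invented):

```python
# Fine-tuning sketch of cross-domain knowledge transfer.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
Xs = rng.normal(size=(1000, 5)); ys = (Xs[:, 0] + Xs[:, 1] > 0).astype(int)        # source
Xt = rng.normal(size=(30, 5)) + 0.3; yt = (Xt[:, 0] + Xt[:, 1] > 0.3).astype(int)  # shifted target

clf = SGDClassifier(loss="log_loss", random_state=0)      # "log" on older sklearn
clf.partial_fit(Xs, ys, classes=np.array([0, 1]))         # learn source-domain knowledge
clf.partial_fit(Xt, yt)                                   # adapt to the target domain
print("target accuracy:", clf.score(Xt, yt))
```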
DOI: 10.37421/0974-7230.2023.16.491
DOI: 10.37421/0974-7230.2023.16.486
DOI: 10.37421/0974-7230.2023.16.472
DOI: 10.37421/0974-7230.2023.16.473
DOI: 10.37421/0974-7230.2023.16.474
DOI: 10.37421/0974-7230.2023.16.475
Neural networks have witnessed remarkable advancements in recent years, revolutionizing various fields such as computer vision, natural language processing, and reinforcement learning. This research article delves into the current state of neural networks and explores the potential breakthroughs and directions they may take in the future. We discuss key areas of development, including architecture enhancements, ethical considerations, and emerging applications, shedding light on the exciting prospects of neural networks in the years to come.
DOI: 10.37421/0974-7230.2023.16.476
DOI: 10.37421/0974-7230.2023.16.477
DOI: 10.37421/0974-7230.2023.16.478
The advent of neural networks in healthcare has ushered in a new era of precision medicine, transforming the way diseases are diagnosed and treated. Neural networks, a subset of artificial intelligence, have demonstrated remarkable capabilities in analyzing vast amounts of medical data, enhancing diagnostic accuracy, and personalizing treatment plans. This research article explores the profound impact of neural networks in healthcare, highlighting their applications, challenges, and potential future developments.
Healthcare is undergoing a digital transformation, with data playing a central role in driving innovations in diagnosis and treatment. The integration of neural networks, a subfield of artificial intelligence inspired by the human brain's structure and function, has been instrumental in harnessing the power of healthcare data. Neural networks have shown immense promise in improving disease detection, treatment recommendations, and patient outcomes.
DOI: 10.37421/0974-7230.2023.16.479
Cloud computing has become an integral part of our digital infrastructure, powering a wide range of applications and services. However, as the demand for more computational power and data storage continues to grow, traditional cloud computing faces significant challenges. Quantum cloud computing, a novel paradigm that combines the principles of quantum mechanics with cloud services, offers a promising solution. This article explores the concept of quantum cloud computing, with a focus on leveraging quantum entanglement for enhanced performance and security in next-generation cloud services.
DOI: 10.37421/0974-7230.2023.16.480
Serverless computing has emerged as a paradigm-shifting technology in cloud computing, promising scalability, cost-efficiency, and reduced operational overhead. This research article explores the architectural patterns, performance optimization techniques, and real-world use cases of serverless computing in the cloud. We delve into the core concepts of serverless computing, its advantages, challenges, and practical implementations. Through a detailed analysis of architectural patterns and optimization strategies, we provide insights into how organizations can harness the full potential of serverless computing for their applications. Additionally, we present case studies illustrating the diverse range of applications and industries where serverless computing has made a significant impact.
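A minimal sketch of the function-as-a-service pattern at the heart of serverless computing, written in the AWS Lambda handler convention (one provider convention among several; the article is not tied to any one platform): the platform invokes the function per event, and durable state must live outside the process.

```python
# Serverless HTTP handler sketch (AWS Lambda-style signature).
import json

def handler(event, context):
    # 'event' carries the trigger payload; here, an HTTP request shape.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test; in production the platform supplies event/context.
print(handler({"queryStringParameters": {"name": "cloud"}}, None))
```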
DOI: 10.37421/0974-7230.2023.16.481
DOI: 10.37421/0974-7230.2023.16.452
DOI: 10.37421/0974-7230.2023.16.453
DOI: 10.37421/0974-7230.2023.16.454
DOI: 10.37421/0974-7230.2023.16.455
DOI: 10.37421/0974-7230.2023.16.456
DOI: 10.37421/0974-7230.2022.15.447
Several surveys of the findings of research into the power consumption of virtual entities (VEs), i.e. virtual machines (VMs) or containers, have been published. Our contribution to this work is a thorough examination of the dynamics of the research itself: the challenges, approaches, pitfalls, fallacies, and research gaps, without ignoring the research results. Our target audience is the prospective researcher who wants to understand the dynamics of research into predictive modelling, and the supporting measurements, of power consumption by individual VEs relevant to the telco cloud. We characterise these dynamics through a thorough frequency analysis, using a novel method we developed for parsing the research literature. Through the visual aids we provide and our observations across cross-cutting themes, a prospective researcher can obtain a thorough characterization of the problems, approaches, developments, formal methods, pitfalls, fallacies, and research gaps in this research space. Among the themes identified by our survey, we noted that all of the problem categories touch on one or more of a set of major variables that may affect power consumption by virtual entities and the resulting model representations: workload type, virtualization agent (VM or container) characteristics, host machine resources and architecture, temperature, operating frequency, and attribution of a fraction of consumed power.
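The kind of predictive model this literature builds can be sketched as a utilisation-based regression: fit coefficients to measured host power, then attribute a per-VE share from each VE's utilisation. All readings below are synthetic illustrations:

```python
# Least-squares power model: P = p_idle + a*cpu_util + b*mem_util.
import numpy as np

# Columns: [1 (idle term), cpu_util, mem_util]; one row per measurement.
X = np.array([[1, 0.10, 0.20], [1, 0.40, 0.30], [1, 0.70, 0.50], [1, 0.95, 0.70]])
watts = np.array([112.0, 146.0, 183.0, 214.0])  # measured host power

coef, *_ = np.linalg.lstsq(X, watts, rcond=None)
p_idle, p_cpu, p_mem = coef
print(f"idle={p_idle:.1f}W cpu={p_cpu:.1f}W/unit mem={p_mem:.1f}W/unit")

# Attribute power to one VE from its own utilisation share:
ve_power = p_cpu * 0.25 + p_mem * 0.10   # a VE using 25% CPU, 10% memory
print(f"estimated VE share: {ve_power:.1f}W")
```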
DOI: 10.37421/0974-7230.2022.15.448
DOI: 10.37421/0974-7230.2022.15.449
DOI: 10.37421/0974-7230.2022.15.450
DOI: 10.37421/0974-7230.2022.15.451
DOI: 10.37421/0974-7230.2022.15.442
DOI: 10.37421/0974-7230.2022.15.443
DOI: 10.37421/0974-7230.2022.15.444
DOI: 10.37421/0974-7230.2022.15.445
Ultra-wideband (UWB) sensors use radio frequency technology to precisely determine position through wireless communication between devices. The most recent applications concentrate on locating and collecting sensor data from mobile phones, car keys, and similar devices; in the mining industry, however, this technology remains underutilised. This viewpoint provides implementation options and solutions to bridge that gap, and assesses the advantages and disadvantages of using ultra-wideband for mining. The measurements provided were made with QORVO two-way ranging sensors and compared to theoretical and existing technological solutions. To ensure that UWB sensors are used optimally, special emphasis was placed on influencing factors such as UWB location methods and factors affecting measurement accuracy, including line of sight, multipath propagation, the effect of shielding, and the ideal measurement setup. An experiment revealed that results are most accurate when there is no multipath propagation and the arriving signal travels directly from the transmitter to the receiver.
Ultra-wideband is a radio technology that allows short-range, high-bandwidth communication at very low energy levels while covering a large portion of the radio spectrum. Although introduced as early as 1901, this technology has many applications today. Unlike other similar technologies, it is not tied to a single frequency; UWB can also send data by utilising unused frequency capacity across a very wide frequency range, with a minimum bandwidth of 500 MHz. In general, the frequency response and pulse width of a signal determine the accuracy that can be achieved. The response frequency of UWB can range between 10 and 40 MHz, and the pulse width can be as short as one nanosecond, giving it a theoretical accuracy of one millimetre.
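The principle behind two-way ranging sensors like those used above reduces to a simple time-of-flight computation: distance follows from the round-trip time minus the responder's known reply delay. The numbers below are illustrative, not taken from the article's measurements:

```python
# Single-sided two-way ranging sketch.
C = 299_792_458.0  # speed of light, m/s

def twr_distance(t_round_s, t_reply_s):
    time_of_flight = (t_round_s - t_reply_s) / 2.0
    return C * time_of_flight

# A 2 ns one-way flight (~0.6 m) with a 1 us responder turnaround:
print(f"{twr_distance(1.004e-6, 1.000e-6):.3f} m")
```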
DOI: 10.37421/0974-7230.2022.15.446
DOI: 10.37421/0974-7230.2022.15.437
DOI: 10.37421/0974-7230.2022.15.438
Because of its high transmission efficiency and acceptable transmission distance, the magnetic coupling resonant wireless power transfer (MCR-WPT) system is regarded as the most promising wireless power transfer (WPT) method. Magnetic cores made of magnetic materials are typically added to MCR-WPT systems to improve coupling performance and achieve magnetic concentration. However, as WPT technology advances, traditional magnetic materials gradually become a bottleneck, limiting system power density enhancement.
High-performance Mn-Zn and Ni-Zn ferrites, amorphous and nanocrystalline materials, and metamaterials have been developed rapidly in recent years to meet the electromagnetic characteristics requirements of WPT systems. This paper begins with a comprehensive review of the magnetic materials used in WPT systems and concludes with cutting-edge WPT technology and the development and application of high-performance magnetic materials. The study provides an exclusive resource for researchers and engineers interested in learning about the technology and highlights critical issues that must be addressed. Finally, the potential challenges and opportunities of WPT magnetic materials are presented, and the technology's future development directions are predicted and discussed.
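A standard figure of merit for magnetically coupled resonant links (a textbook result, not a formula quoted from this article) makes the materials bottleneck concrete: maximum link efficiency depends only on the coupling coefficient k and the coil quality factors Q1, Q2.

```python
# Maximum link efficiency of a resonant inductive link:
# eta_max = (k^2*Q1*Q2) / (1 + sqrt(1 + k^2*Q1*Q2))^2
import math

def max_link_efficiency(k, q1, q2):
    kq2 = (k ** 2) * q1 * q2
    return kq2 / (1.0 + math.sqrt(1.0 + kq2)) ** 2

# Better magnetic cores raise k and Q, hence efficiency:
print(f"{max_link_efficiency(0.05, 100, 100):.2%}")   # weak coupling, modest Q
print(f"{max_link_efficiency(0.20, 300, 300):.2%}")   # improved core and coils
```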
DOI: 10.37421/0974-7230.2022.15.439
DOI: 10.37421/0974-7230.2022.15.440
DOI: 10.37421/0974-7230.2022.15.441
DOI: 10.37421/0974-7230.2022.15.427
DOI: 10.37421/0974-7230.2022.15.428
DOI: 10.37421/0974-7230.2022.15.429
The pervasive or ubiquitous use of computing, i.e. the incorporation of sensor information, processing, and communication technology into everyday objects, has been integrated into a wide range of sports and exercise-related areas. Examples include scientific experiments in motion studies conducted under ecologically valid conditions; assistive technology supporting recreational and health-conscious athletes in their physical activities; and performance and tactical analyses of association football games delivered immediately after the end of a match. We discuss the current state of the art in micro-electromechanical system inertial measurement units (MEMS IMUs) and current approaches to data acquisition on human activities and sports. The integration of these sensors with cloud computing technologies is discussed later in the work; first, we examine the benefits of wearable devices, which, combined with mobile computing, have gained a strong foothold in sport performance analysis. We look at a variety of pervasive computing applications in sports performance and health, among them advances in computer vision with deep learning algorithms used to evaluate sports skills, promote injury prevention, and provide key performance indicators in a variety of sports. The progress in integrating various sensors into wearable intelligent monitoring systems is broadly described. Sensor fusion is used in sports and health monitoring specifically to quantify exercise parameters and body response, classify activities and movement patterns, estimate energy expenditure, and assess sleeping patterns.
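A classic first step in the IMU sensor fusion mentioned above is the complementary filter: blend the gyroscope's fast but drifting angle estimate with the accelerometer's noisy but drift-free gravity direction. A sketch with invented readings (real pipelines calibrate axes and handle full 3-D orientation):

```python
# Complementary-filter sketch for pitch estimation from a MEMS IMU.
import math

def complementary_pitch(pitch_deg, gyro_dps, ax, az, dt, alpha=0.98):
    gyro_pitch = pitch_deg + gyro_dps * dt           # integrate angular rate
    accel_pitch = math.degrees(math.atan2(ax, az))   # gravity direction
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
for ax, az, gyro in [(0.17, 0.98, 9.5), (0.34, 0.94, 9.8), (0.50, 0.87, 9.6)]:
    pitch = complementary_pitch(pitch, gyro, ax, az, dt=0.01)
print(f"estimated pitch: {pitch:.2f} deg")
```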
DOI: 10.37421/0974-7230.2022.15.430
DOI: 10.37421/0974-7230.2022.15.431
DOI: 10.37421/0974-7230.2022.15.422
DOI: 10.37421/0974-7230.2022.15.423
DOI: 10.37421/0974-7230.2022.15.424
DOI: 10.37421/0974-7230.2022.15.425
Sensor-based sorting techniques have the potential to improve ore grades while reducing waste material processing. Previous research has shown that by discarding waste prior to further processing, sensor-based sorting can reduce energy, water, and reagent consumption, as well as fine waste production. Recent studies of sensor-based sorting and the fundamental mechanisms of the main sorting techniques are evaluated in this literature review to inform optimal sensor selection. Furthermore, the fusion of data from multiple sensing techniques is being investigated in order to improve characterization of the sensed material and thus sorting capability. The key to effective sensor-based sorting implementation was found to be the selection of a sensing technique that can detect a characteristic capable of separating ore from waste, with a sampling distribution sufficient for the considered sorting method. Classifications of possible sensor fusion sorting applications in mineral processing are proposed and illustrated with case studies. It was also discovered that the main impediment to implementing sensor fusion is a lack of correlative data on the response of multiple sensing techniques to the same ore sample. To provide data for the evaluation and development of sensor fusion techniques, a combined approach of experimental testing supplemented by simulations is proposed.
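In its simplest form, the sensor fusion discussed above concatenates the responses of multiple sensing techniques per particle and trains a single ore/waste classifier on the fused feature vector. The "XRF" and "NIR" readings below are synthetic stand-ins, since correlative multi-sensor data is exactly what the review notes is scarce:

```python
# Feature-level sensor-fusion sketch for ore/waste sorting.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 400
xrf = rng.normal(size=(n, 3))      # e.g. elemental intensities
nir = rng.normal(size=(n, 2))      # e.g. spectral band ratios
is_ore = (xrf[:, 0] + 0.8 * nir[:, 0] > 0.5).astype(int)

fused = np.hstack([xrf, nir])      # correlative data per particle
clf = LogisticRegression(max_iter=1000).fit(fused, is_ore)
print("fused-sensor training accuracy:", clf.score(fused, is_ore))
```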
DOI: 10.37421/0974-7230.2022.15.426
DOI: 10.37421/0974-7230.2022.15.417
Using machine learning techniques in smart farming is gaining momentum worldwide. The main goal is to achieve better production by predicting the right crop for the present weather and soil conditions. Increasingly uncertain climatic changes lead to reduced yields when farmers follow traditional ways of growing crops. Soil features and weather conditions change over time, and attending to these criteria leads to precision farming. This study implements machine learning techniques to predict the right crop for cultivation, expecting better yield, taking into account the prevailing weather and soil conditions whenever the farmer plans a fresh crop.
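A sketch in the spirit of the study: learn a mapping from soil and weather features to the best crop. The feature names, rows, and crop labels below are hypothetical; the abstract does not publish the study's dataset:

```python
# Crop-recommendation sketch with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Features: [soil N, soil P, soil K, temperature C, humidity %, pH, rainfall mm]
X = np.array([
    [90, 42, 43, 21, 82, 6.5, 200],
    [20, 67, 20, 26, 52, 5.9, 60],
    [60, 55, 44, 23, 80, 7.0, 150],
    [25, 60, 25, 27, 48, 6.1, 55],
])
y = np.array(["rice", "millet", "rice", "millet"])

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.predict([[85, 45, 40, 22, 78, 6.8, 180]]))  # fresh field conditions
```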
DOI: 10.37421/0974-7230.2022.15.418
DOI: 10.37421/0974-7230.2022.15.419
DOI: 10.37421/0974-7230.2022.15.420
DOI: 10.37421/0974-7230.2022.15.421
DOI: 10.37421/0974-7230.2022.15.413
DOI: 10.37421/0974-7230.2022.15.414
DOI: 10.37421/0974-7230.2022.15.412
DOI: 10.37421/0974-7230.2022.15.415
DOI: 10.37421/0974-7230.2022.15.416
Jeffrey E. Arle* and Longzhi Mei
DOI: 10.37421/0974-7230.2022.15.407
The stability of evolutionary systems is critical to their survival and reproductive success. In the face of continuous extrinsic and intrinsic stimuli, biological systems must evolve to perform robustly within sub-regions of parameter and trajectory spaces that are often astronomical in magnitude, a property characterized as homeostasis over a century ago in medicine and later in cybernetics and control theory. Various evolutionary design strategies for robustness have evolved and are conserved across species, such as redundancy, modularity, and hierarchy. We investigate the hypothesis that one strategy for robustness is to evolve neural circuitry network components and topology such that increasing the number of components results in greater system stability. As measured by a center of maximum curvature method related to firing rates, the neural circuitry systems model transitioned to a robust state at ~153 network connections (network degree).
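A maximum-curvature criterion like the one named above can be sketched as finding the knee of a stability-versus-degree curve at the point of peak curvature, kappa = |y''| / (1 + y'^2)^(3/2). The curve below is synthetic, not the paper's model output:

```python
# Knee detection via maximum curvature on a toy stability curve.
import numpy as np

degree = np.linspace(1, 300, 3000)
stability = 1.0 / (1.0 + np.exp(-(degree - 150) / 15.0))  # saturating toy response

dy = np.gradient(stability, degree)
d2y = np.gradient(dy, degree)
curvature = np.abs(d2y) / (1.0 + dy ** 2) ** 1.5

print("knee at degree ~", round(degree[np.argmax(curvature)]))
```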
DOI: 10.37421/0974-7230.2022.15.408
DOI: 10.37421/0974-7230.2022.15.409
DOI: 10.37421/0974-7230.2022.15.410
DOI: 10.37421/0974-7230.2022.15.411
DOI: 10.37421/jcsb.2022.15.403
DOI: 10.37421/jcsb.2022.15.405
DOI: 10.37421/jcsb.2022.15.404
DOI: 10.37421/jcsb.2022.15.406
DOI: 10.37421/jms.2022.11.392
DOI: 10.37421/jms.2022.11.393
DOI: 10.37421/jms.2022.11.394
DOI: 10.37421/jms.2022.11.395
DOI: 10.37421/jms.2022.11.396
DOI: 10.37421/0974-7230.2024.17.506
DOI: 10.37421/0974-7230.2024.17.508
In the realm of medical research, the intersection of biology and computer science has given rise to a groundbreaking field known as computational immunology. This burgeoning discipline seeks to unravel the complexities of the immune system using advanced algorithms and computational models. The immune system, an intricate network of cells and proteins, plays a crucial role in defending the body against pathogens and maintaining overall health. Computational immunology leverages the power of data analysis, mathematical modeling, and machine learning to decipher the intricacies of immune responses, paving the way for innovative approaches to understanding diseases and developing targeted therapeutic interventions.
DOI: 10.37421/0974-7230.2024.17.509
In the rapidly evolving landscape of biological research, the integration of computer science techniques into systems biology has emerged as a transformative force, offering novel insights into the complexities of living organisms. This synergy between computer science and biology holds the promise of unraveling intricate biological networks, predicting cellular behaviors, and accelerating drug discovery processes. However, this integration is not without its challenges. In this article, we explore the hurdles and opportunities associated with the convergence of computer science and systems biology, shedding light on the path ahead for researchers in these interdisciplinary fields.
DOI: 10.37421/0974-7230.2024.17.510
In the fast-evolving landscape of scientific research, particularly in the realm of systems biology, the need for robust data management and security has become increasingly crucial. Traditional methods of data storage and management have faced challenges related to security breaches, data integrity, and transparency. The integration of blockchain technology in bioinformatics offers a promising solution to address these concerns, providing a decentralized and secure framework for managing and sharing sensitive biological data. In this article, we delve into the intersection of blockchain and bioinformatics, exploring the potential benefits, challenges, and applications of blockchain in enhancing security and transparency in systems biology research.
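The mechanism described above can be sketched as a hash chain: each block commits to its predecessor's hash, so altering any stored result breaks every later link. This is a minimal sketch; a real deployment adds consensus, signatures, and access control:

```python
# Hash-chained research-record sketch.
import hashlib, json

def make_block(data, prev_hash):
    block = {"data": data, "prev": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block({"sample": "S1", "expr": [2.1, 0.4]}, prev_hash="0" * 64)]
chain.append(make_block({"sample": "S2", "expr": [1.7, 0.9]}, chain[-1]["hash"]))

def verify(chain):
    for i, b in enumerate(chain):
        body = {"data": b["data"], "prev": b["prev"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != b["hash"]:
            return f"tampering at block {i}"
        if i and b["prev"] != chain[i - 1]["hash"]:
            return f"broken link at block {i}"
    return "chain intact"

chain[0]["data"]["expr"][0] = 9.9   # tamper with a stored measurement
print(verify(chain))
```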
DOI: 10.37421/0974-7230.2024.17.511
DOI: 10.37421/0974-7230.2024.17.502
DOI: 10.37421/0974-7230.2024.17.503
DOI: 10.37421/0974-7230.2024.17.507
In the realm of modern biology, two groundbreaking fields, gene editing through CRISPR technology and computational systems biology, are converging to revolutionize our understanding of genetics and biological systems. This article explores the powerful synergy between CRISPR and code, highlighting how the integration of gene-editing techniques with computational approaches is reshaping the landscape of systems biology. CRISPR, or Clustered Regularly Interspaced Short Palindromic Repeats, was initially discovered as a part of the bacterial immune system. However, scientists have ingeniously adapted and repurposed this system into a revolutionary gene-editing tool. Concurrently, the field of systems biology has emerged as a holistic approach to studying the complex interactions within biological systems. The fusion of CRISPR and computational systems biology is transforming the way we explore and manipulate the fundamental building blocks of life.
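On the computational side, a first step in CRISPR guide design is purely string processing: Cas9 requires an "NGG" PAM, so candidate 20-nt protospacers sit immediately 5' of any NGG in the sequence. A toy scan (the sequence is invented; real design tools also score genome-wide off-target risk):

```python
# Candidate Cas9 guide-site scan for NGG PAMs.
def candidate_guides(seq, guide_len=20):
    hits = []
    for i in range(guide_len, len(seq) - 2):
        if seq[i + 1:i + 3] == "GG":                  # N-G-G PAM at i..i+2
            hits.append((i - guide_len, seq[i - guide_len:i]))
    return hits

dna = "ATGCTAGCTAGGATCGATCGATCGGCTAGCTAGCTAAGGTAGCTAGCTAGCTGGG"
for pos, guide in candidate_guides(dna):
    print(pos, guide)
```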
DOI: 10.37421/0974-7230.2024.17.504
DOI: 10.37421/0974-7230.2024.17.505
DOI: 10.37421/0974-7230.2023.16.467
DOI: 10.37421/0974-7230.2023.16.468
DOI: 10.37421/0974-7230.2023.16.469
DOI: 10.37421/0974-7230.2023.16.470
DOI: 10.37421/0974-7230.2023.16.471
With the increasing dependence on the internet for various activities, ensuring robust security measures has become paramount. Blockchain technology has emerged as a promising solution for enhancing internet security due to its decentralized and immutable nature. This systematic review aims to provide a comprehensive understanding of the role of blockchain technology in strengthening internet security. Through a thorough analysis of existing literature, this review explores the key concepts, applications, and challenges associated with the integration of blockchain in internet security. The findings highlight the potential of blockchain technology in enhancing data integrity, authentication, access control, and privacy protection. Furthermore, this review discusses the limitations and future research directions in this domain.
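The data-integrity property highlighted above rests on constructions like the Merkle tree: a single 32-byte root commits to a whole batch of records, and changing any record changes the root. A sketch with arbitrary example records:

```python
# Merkle-root integrity sketch.
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root(leaves):
    level = [h(x) for x in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

records = [b"login:alice", b"login:bob", b"transfer:42", b"logout:alice"]
root = merkle_root(records)
records[2] = b"transfer:9000"             # any tampering...
print(root.hex() != merkle_root(records).hex())  # ...changes the root: True
```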
DOI: 10.37421/0974-7230.2023.16.462
Deep Neural Networks (DNNs) have achieved remarkable success in various domains, ranging from computer vision to natural language processing. However, their increasing complexity poses challenges in terms of model size, memory requirements, and computational costs. To address these issues, researchers have turned their attention to sparsity, a technique that introduces structural zeros into the network, thereby reducing redundancy and improving efficiency. This research article explores the role of sparsity in DNNs and its impact on performance improvement. We review existing literature, discuss sparsity-inducing methods, and analyze the benefits and trade-offs associated with sparse networks. Furthermore, we present experimental results that demonstrate the effectiveness of sparsity in improving performance metrics such as accuracy, memory footprint, and computational efficiency. Our findings highlight the potential of sparsity as a powerful tool for optimizing DNNs and provide insights into future research directions in this field.
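The simplest sparsity-inducing method the article reviews is magnitude pruning: zero out the weights with the smallest absolute values and keep a mask so later updates cannot revive them. A sketch on a random weight matrix:

```python
# Magnitude-pruning sketch.
import numpy as np

rng = np.random.default_rng(3)
weights = rng.normal(size=(256, 128))

def magnitude_prune(w, sparsity):
    threshold = np.quantile(np.abs(w), sparsity)   # e.g. drop the smallest 90%
    mask = (np.abs(w) > threshold).astype(w.dtype)
    return w * mask, mask

pruned, mask = magnitude_prune(weights, sparsity=0.9)
print("nonzero fraction:", mask.mean())            # ~0.1 of weights survive
```

Sparse matrices need less memory and, with suitable kernels, fewer operations, which is the accuracy/footprint/efficiency trade-off the article analyzes.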
DOI: 10.37421/0974-7230.2023.16.463
DOI: 10.37421/0974-7230.2023.16.464
DOI: 10.37421/0974-7230.2023.16.465
DOI: 10.37421/0974-7230.2023.16.466
DOI: 10.37421/jcsb.2022.15.399
DOI: 10.37421/jcsb.2022.15.400
DOI: 10.37421/jcsb.2022.15.398
DOI: 10.37421/jcsb.2022.15.397
DOI: 10.37421/jcsb.2022.15.401
DOI: 10.37421/0974-7230.2023.16.460
DOI: 10.37421/0974-7230.2023.16.461
DOI: 10.37421/0974-7230.2023.16.457
DOI: 10.37421/0974-7230.2023.16.459
DOI: 10.37421/0974-7230.2023.16.458
Cloud computing has emerged as a paradigm shift in computing, providing scalable and on-demand access to computing resources. With the increasing demand for cloud services, efficient resource utilization and load balancing have become critical factors for optimizing performance and ensuring cost-effectiveness. This research article explores dynamic load balancing techniques in cloud computing to enhance resource utilization and improve overall system efficiency. We examine various load balancing algorithms and strategies employed in cloud environments, highlighting their advantages, limitations, and areas of application. Through an in-depth analysis of the literature, this article aims to provide valuable insights into the dynamic load balancing techniques that can be leveraged to achieve optimal resource utilization in cloud computing.
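One of the dynamic algorithms such surveys cover is least-connections balancing: each request goes to the server currently holding the fewest active connections, adapting as load shifts. A minimal sketch (server names are placeholders):

```python
# Least-connections load balancer sketch.
class LeastConnectionsBalancer:
    def __init__(self, servers):
        self.active = {s: 0 for s in servers}

    def acquire(self):
        server = min(self.active, key=self.active.get)
        self.active[server] += 1
        return server

    def release(self, server):
        self.active[server] -= 1

lb = LeastConnectionsBalancer(["node-a", "node-b", "node-c"])
picks = [lb.acquire() for _ in range(5)]
lb.release("node-a")
print(picks, "->", lb.acquire())   # freed capacity steers the next request
```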