QUERY RUN: 30 Mar 2020 at 01:34
HITS: 1441

Bibliography on: Cloud Computing


Robert J. Robbins is a biologist, an educator, a science administrator, a publisher, an information technologist, and an IT leader and manager who specializes in advancing biomedical knowledge and supporting education through the application of information technology.

ESP: PubMed Auto Bibliography. Created: 30 Mar 2020 at 01:34

Cloud Computing

Wikipedia: Cloud Computing Cloud computing is the on-demand availability of computer system resources, especially data storage and computing power, without direct active management by the user. Cloud computing relies on the sharing of resources to achieve coherence and economies of scale. Advocates of public and hybrid clouds note that cloud computing allows companies to avoid or minimize up-front IT infrastructure costs. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and that it enables IT teams to adjust resources more rapidly to meet fluctuating and unpredictable demand, providing burst computing capability: high computing power during periods of peak demand. Cloud providers typically use a "pay-as-you-go" model, which can lead to unexpected operating expenses if administrators are not familiar with cloud pricing models. The possibility of unexpected operating expenses is especially problematic in a grant-funded research institution, where funds may not be readily available to cover significant cost overruns.
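As a back-of-the-envelope illustration of how pay-as-you-go charges accumulate, and why an always-on research VM surprises a grant budget, here is a sketch; all rates are invented for the example, not any provider's actual pricing.

```python
# Hypothetical pay-as-you-go cost sketch: rates below are made-up
# illustration values, not any cloud provider's actual pricing.
def monthly_cost(vm_hours, hourly_rate, storage_gb, gb_month_rate,
                 egress_gb, egress_rate):
    """Estimate one month's bill from compute, storage, and egress usage."""
    return vm_hours * hourly_rate + storage_gb * gb_month_rate + egress_gb * egress_rate

# A lab VM left running all month (730 h) vs. used 40 h/week (~173 h):
always_on = monthly_cost(730, 0.10, 500, 0.02, 50, 0.09)
as_needed = monthly_cost(173, 0.10, 500, 0.02, 50, 0.09)
print(f"always-on: ${always_on:.2f}, as-needed: ${as_needed:.2f}")
```

The gap between the two figures is exactly the kind of avoidable overrun the paragraph above warns about.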

Created with PubMed® Query: cloud[TIAB] and (computing[TIAB] or "amazon web services"[TIAB] or google[TIAB] or "microsoft azure"[TIAB]) NOT pmcbook NOT ispreviousversion
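A sketch of how this query could be submitted programmatically via NCBI's E-utilities esearch endpoint; the query string is taken verbatim from this page, while the retmax/retmode parameters are illustrative choices. The code only builds the request URL, so no network access is needed.

```python
from urllib.parse import urlencode

# NCBI E-utilities esearch endpoint for PubMed.
BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# The bibliography's query, verbatim.
term = ('cloud[TIAB] and (computing[TIAB] or "amazon web services"[TIAB] '
        'or google[TIAB] or "microsoft azure"[TIAB]) '
        'NOT pmcbook NOT ispreviousversion')

# retmax/retmode are illustrative parameter choices.
params = {"db": "pubmed", "term": term, "retmax": 100, "retmode": "json"}
url = BASE + "?" + urlencode(params)
print(url)
```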

Citations: The Papers (from PubMed®)

RevDate: 2020-03-12
CmpDate: 2020-01-24

Kaya MC, Saeedi Nikoo M, Schwartz ML, et al (2020)

Internet of Measurement Things Architecture: Proof of Concept with Scope of Accreditation.

Sensors (Basel, Switzerland), 20(2).

Many industries, such as manufacturing, aviation, and power generation, employ sensitive measurement devices that must be calibrated by certified experts. The diversity and sophistication of measurement devices and their calibration needs require networked and automated solutions. The Internet of Measurement Things (IoMT) is an architectural framework, based on the Industrial Internet of Things, for the calibration industry. This architecture involves a layered model with a cloud-centric middle layer. In this article, the realization of this conceptual architecture is described. The applicability of the IoMT architecture in the calibration industry is shown through an editor application for Scope of Accreditation. The cloud side of the implementation is deployed to Microsoft Azure. The editor itself is created as a cloud service, and IoT Hub is used to collect data from calibration laboratories. By adapting the IoMT architecture to a commonly used cloud platform, considerable progress is made toward encompassing metrology data and serving the majority of the stakeholders.

RevDate: 2020-03-27

Sandhu R, Sood SK, G Kaur (2016)

An intelligent system for predicting and preventing MERS-CoV infection outbreak.

The Journal of supercomputing, 72(8):3033-3056.

MERS-CoV is an airborne disease that spreads easily and has a high death rate. To predict and prevent MERS-CoV, real-time analysis of a user's health data and geographic location is fundamental. The development of healthcare systems using cloud computing is emerging as an effective solution with the benefits of better quality of service, reduced cost, scalability, and flexibility. In this paper, an effective cloud computing system is proposed that predicts MERS-CoV-infected patients using a Bayesian belief network and provides geographic-based risk assessment to control its outbreak. The proposed system is tested on synthetic data generated for 0.2 million users. The system provided high classification accuracy and appropriate geographic-based risk assessment. A key point of this paper is the use of a geographic positioning system to represent each MERS-CoV user on Google Maps so that possibly infected users can be quarantined as early as possible. This will help uninfected citizens avoid regional exposure and government agencies manage the problem more effectively.
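In the simplest conditionally independent case, the Bayesian classification the paper describes reduces to Bayes' rule; a toy two-feature version with made-up probabilities (not the authors' trained belief network) looks like:

```python
# Toy naive-Bayes posterior: the paper uses a full Bayesian belief
# network; the priors and likelihoods below are invented for illustration.
def posterior(prior, likelihoods_pos, likelihoods_neg):
    """P(infected | evidence) assuming conditionally independent evidence."""
    p_pos = prior
    p_neg = 1.0 - prior
    for lp, ln in zip(likelihoods_pos, likelihoods_neg):
        p_pos *= lp   # P(evidence_i | infected)
        p_neg *= ln   # P(evidence_i | not infected)
    return p_pos / (p_pos + p_neg)

# Evidence: fever present, recent travel to an outbreak region.
p = posterior(prior=0.001,
              likelihoods_pos=[0.90, 0.60],   # P(fever|inf), P(travel|inf)
              likelihoods_neg=[0.10, 0.02])   # P(fever|not), P(travel|not)
print(f"P(infected | fever, travel) = {p:.3f}")
```

Even with a tiny prior, two strong pieces of evidence raise the posterior by two orders of magnitude, which is why real-time evidence collection matters for early quarantine.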

RevDate: 2020-03-24

Pérez de Prado R, García-Galán S, Muñoz-Expósito JE, et al (2020)

Smart Containers Schedulers for Microservices Provision in Cloud-Fog-IoT Networks. Challenges and Opportunities.

Sensors (Basel, Switzerland), 20(6): pii:s20061714.

Docker containers are the lightweight-virtualization technology prevailing today for the provision of microservices. This work raises and discusses two main challenges in Docker container scheduling in cloud-fog-internet of things (IoT) networks. First, the convenience of integrating intelligent container schedulers based on soft computing into the dominant open-source container-management platforms: Docker Swarm, Google Kubernetes, and Apache Mesos. Second, the need for specific intelligent container schedulers for the different interfaces in cloud-fog-IoT networks: cloud-to-fog, fog-to-IoT, and cloud-to-IoT. The goal of this work is to support the optimal allocation of microservices provided by the main cloud service providers today and used by millions of users worldwide in applications such as smart health and content delivery networks. In particular, the improvement is studied in terms of quality of service (QoS) parameters such as latency, load balance, energy consumption, and runtime, based on the analysis of previous works and implementations. Moreover, the scientific-technical impact of smart container scheduling in the market is also discussed, showing the possible repercussions of the raised opportunities in this research line.

RevDate: 2020-03-17

Fan X, Zheng H, Jiang R, et al (2020)

Optimal Design of Hierarchical Cloud-Fog&Edge Computing Networks with Caching.

Sensors (Basel, Switzerland), 20(6): pii:s20061582.

This paper investigates the optimal design of a hierarchical cloud-fog&edge computing (FEC) network, which consists of three tiers, i.e., the cloud tier, the fog&edge tier, and the device tier. A device in the device tier processes its task via one of three computing modes, i.e., cache-assisted computing mode, cloud-assisted computing mode, and joint device-fog&edge computing mode. Specifically, the task is completed via content caching in the FEC tier, computation offloading to the cloud tier, or joint computing in the fog&edge and device tiers, respectively. For such a system, an energy minimization problem is formulated by jointly optimizing the computing mode selection, the local computing ratio, the computation frequency, and the transmit power, while guaranteeing multiple system constraints, including the task completion deadline, the achievable computation capability, and the achievable transmit power threshold. Since the problem is a mixed integer nonlinear programming problem, which is hard to solve with known standard methods, it is decomposed into three subproblems, and the optimal solution to each subproblem is derived. Then, an efficient optimal caching, cloud, and joint computing (CCJ) algorithm to solve the primary problem is proposed. Simulation results show that the system performance achieved by our proposed optimal design outperforms that achieved by the benchmark schemes. Moreover, the smaller the achievable transmit power threshold of the device, the more energy is saved. In addition, as the data size of the task increases, the local computing ratio decreases.
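The computing-mode selection step can be caricatured as picking the minimum-energy mode among those that meet the deadline; the energy and time figures below are invented for illustration, not the paper's model.

```python
# Toy computing-mode selection: choose the feasible mode (deadline met)
# with minimum energy. All numbers are illustrative.
def select_mode(modes, deadline):
    """modes: name -> (energy J, completion time s); returns chosen name or None."""
    feasible = [(name, e) for name, (e, t) in modes.items() if t <= deadline]
    if not feasible:
        return None   # no mode can finish in time
    return min(feasible, key=lambda x: x[1])[0]

modes = {
    "cache": (0.1, 0.01),   # content already cached in the FEC tier
    "cloud": (0.8, 0.50),   # offload to the cloud tier
    "joint": (0.4, 0.20),   # split between device and fog&edge tier
}
print(select_mode(modes, deadline=0.3))
```

With a 0.3 s deadline the cloud mode is infeasible and caching wins on energy; tightening the deadline below every completion time makes the task undeliverable.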

RevDate: 2020-03-16

Ma R, Mei H, Guan H, et al (2020)

LADV: Deep Learning Assisted Authoring of Dashboard Visualizations from Images and Sketches.

IEEE transactions on visualization and computer graphics [Epub ahead of print].

Dashboard visualizations are widely used in data-intensive applications such as business intelligence, operation monitoring, and urban planning. However, existing visualization authoring tools are inefficient in the rapid prototyping of dashboards because visualization expertise and user intention need to be integrated. We propose a novel approach to rapid conceptualization that can construct dashboard templates from exemplars to mitigate the burden of designing, implementing, and evaluating dashboard visualizations. The kernel of our approach is a novel deep learning-based model that can identify and locate charts of various categories and extract colors from an input image or sketch. We design and implement a web-based authoring tool for learning, composing, and customizing dashboard visualizations in a cloud computing environment. Examples, user studies, and user feedback from real scenarios in Alibaba Cloud verify the usability and efficiency of the proposed approach.

RevDate: 2020-03-13

Xiao D, Li M, H Zheng (2020)

Smart Privacy Protection for Big Video Data Storage Based on Hierarchical Edge Computing.

Sensors (Basel, Switzerland), 20(5): pii:s20051517.

Recently, the rapid development of the Internet of Things (IoT) has led to exponential growth of non-scalar data (e.g., images, videos). Local services are far from satisfying storage requirements, and cloud computing fails to effectively support heterogeneous distributed IoT environments, such as wireless sensor networks. To effectively provide smart privacy protection for video data storage, we take full advantage of three patterns of edge computing (multi-access edge computing, cloudlets, and fog computing) to design a hierarchical edge computing architecture, and propose a low-complexity, highly secure scheme based on it. The video is divided into three parts and stored in completely different facilities. Specifically, the most significant bits of key frames are stored directly on local sensor devices, while the least significant bits of key frames are encrypted and sent to semi-trusted cloudlets. The non-key frames are compressed with two-layer parallel compressive sensing, encrypted by the 2D logistic-skew tent map, and then transmitted to the cloud. Simulation experiments and theoretical analysis demonstrate that our proposed scheme not only provides smart privacy protection for big video data storage based on hierarchical edge computing, but also avoids additional computation burden and storage pressure.
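The bit-splitting idea — most significant bits kept locally, least significant bits protected before upload — can be sketched at the byte level as follows; a simple XOR stands in for the paper's chaotic 2D logistic-skew tent cipher, and the nibble split is an illustrative simplification.

```python
# Sketch of the frame-splitting idea: keep each byte's most significant
# nibble on the device, "encrypt" the least significant nibble (XOR here
# is a stand-in for the paper's chaotic cipher) before sending it out.
def split_and_encrypt(frame: bytes, key: int):
    msb = bytes(b & 0xF0 for b in frame)                        # stays local
    lsb_enc = bytes((b & 0x0F) ^ (key & 0x0F) for b in frame)   # to cloudlet
    return msb, lsb_enc

def recombine(msb: bytes, lsb_enc: bytes, key: int) -> bytes:
    return bytes(m | (e ^ (key & 0x0F)) for m, e in zip(msb, lsb_enc))

frame = b"\x12\xAB\xFF"
msb, enc = split_and_encrypt(frame, key=0x07)
assert recombine(msb, enc, key=0x07) == frame   # lossless round trip
```

Neither share alone reconstructs the frame, which is the privacy property the hierarchical placement relies on.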

RevDate: 2020-03-10

Yin S, Wu W, Zhao X, et al (2020)

Understanding spatiotemporal patterns of global forest NPP using a data-driven method based on GEE.

PloS one, 15(3):e0230098 pii:PONE-D-18-34082.

Spatiotemporal patterns of global forest net primary productivity (NPP) are pivotal for understanding the interaction between the climate and the terrestrial carbon cycle. In this study, we use Google Earth Engine (GEE), a powerful cloud platform, to study the dynamics of global forest NPP with remote sensing and climate datasets. In contrast with traditional analyses that divide forest areas according to geographical location or climate types to retrieve general conclusions, we categorize forest regions based on their NPP levels. Nine categories of forests are obtained with the self-organizing map (SOM) method, and eight relative factors are considered in the analysis. We found that although forests can achieve higher NPP with taller, denser, and more broad-leaved trees, the influence of the climate on the NPP is stronger; for the high-NPP categories, precipitation shows a weak or negative correlation with vegetation greenness, while a lack of water may correspond to a decrease in productivity for the low-NPP categories. The low-NPP categories responded mainly to the La Niña event with an increase in NPP, while the NPP of the high-NPP categories increased at the onset of the El Niño event and decreased soon afterwards when the warm phase of the El Niño-Southern Oscillation (ENSO) wore off. The influence of the ENSO changes correspondingly with different NPP levels, which suggests that the pattern of climate oscillation and forest growth conditions have some degree of synchronization. These findings may facilitate the understanding of global forest NPP variation from a different perspective.

RevDate: 2020-02-28

Alexander K, Hanif M, Lee C, et al (2020)

Cost-aware orchestration of applications over heterogeneous clouds.

PloS one, 15(2):e0228086 pii:PONE-D-19-20789.

The orchestration of applications and their components over heterogeneous clouds is recognized as critical in solving the problem of vendor lock-in with regard to distributed and cloud computing. There have been recent strides in cloud application orchestration, with the emergence of the TOSCA standard being a definitive one. Although orchestration by itself provides considerable benefit to consumers of cloud computing services, it remains impractical without a compelling reason to ensure its utilization. If there is no measurable benefit in using orchestration, then clients may opt out of using it altogether. In this paper, we present an approach to cloud orchestration that combines an orchestration model with a cost and policy model to allow for cost-aware application orchestration across heterogeneous clouds. Our approach takes into consideration the operating cost of the application on each provider, while performing a forward projection of the operating cost over a period of time to ensure that cost constraints are not violated. This allows us to leverage the existing state of the art in orchestration and model-driven approaches, as well as tie it to the operations of cloud clients in order to improve utility. Through this study, we show that our approach is capable of providing not only scaling features but also orchestration features for application components distributed across heterogeneous cloud platforms.
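A forward projection of operating cost against a budget constraint might look like this minimal sketch; the rate, usage growth, and budget are hypothetical, not values from the paper.

```python
# Sketch of forward cost projection: find the first month in which the
# cumulative spend would exceed the budget, so a cost-constraint
# violation can be flagged before it happens. All figures are invented.
def first_violation(hourly_rate, hours0, growth, budget, horizon=24):
    total = 0.0
    hours = hours0
    for month in range(1, horizon + 1):
        total += hourly_rate * hours
        if total > budget:
            return month          # constraint violated this month
        hours *= 1.0 + growth     # projected usage growth
    return None                   # safe over the whole horizon

# $0.20/h, 200 h in month 1, 10% monthly usage growth, $1,000 budget:
print(first_violation(0.20, 200, 0.10, 1000.0))
```

An orchestrator using such a projection could migrate components to a cheaper provider before the flagged month rather than after the overrun.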

RevDate: 2020-03-04

Matsuzawa NN, Arai H, Sasago M, et al (2020)

Massive Theoretical Screen of Hole Conducting Organic Materials in the Heteroacene Family by Using a Cloud-Computing Environment.

The journal of physical chemistry. A [Epub ahead of print].

Materials exhibiting higher mobilities than conventional organic semiconducting materials such as fullerenes and fused thiophenes are in high demand for applications in printed electronics. To discover new molecules in the heteroacene family that might show improved charge mobility, a massive theoretical screen of the hole-conducting properties of molecules was performed using a cloud-computing environment. Over 7,000,000 structures of fused furans, thiophenes, and selenophenes were generated, and 250,000 structures were randomly selected for density functional theory (DFT) calculations of hole reorganization energies. The lowest hole reorganization energy calculated was 0.0548 eV, for a fused thioacene having 8 aromatic rings. The 130 compounds with the lowest reorganization energies were further processed by applying combined DFT and molecular dynamics (MD) methods to calculate hole mobilities. The highest mobilities calculated were 1.02 and 9.65 cm2/(V s), based on percolation and disorder theory, respectively, for compounds containing selenium atoms with 8 aromatic rings. These values are about 20 times higher than those for dinaphthothienothiophene (DNTT).

RevDate: 2020-03-07

Silvestre-Blanes J, Sempere-Payá V, T Albero-Albero (2020)

Smart Sensor Architectures for Multimedia Sensing in IoMT.

Sensors (Basel, Switzerland), 20(5): pii:s20051400.

Today, a wide range of developments and paradigms require the use of embedded systems characterized by restrictions on their computing capacity, consumption, cost, and network connection. The evolution of the Internet of Things (IoT) towards the Industrial IoT (IIoT) or the Internet of Multimedia Things (IoMT), its impact within Industry 4.0, the evolution of cloud computing towards edge or fog computing (also called near-sensor computing), and the increase in the use of embedded vision are current examples of this trend. One of the most common methods of reducing energy consumption is processor frequency scaling, based on a particular policy. The algorithms that define this policy are intended to respond well to the workloads that occur on smartphones. There has been no study that allows a correct definition of these algorithms for workloads such as those expected in the above scenarios. This paper presents a method to determine the operating parameters of the dynamic governor algorithm called Interactive, which offers significant improvements in power consumption without reducing application performance. These improvements depend on the load that the system has to support, so the results are evaluated against three different loads, from higher to lower, showing improvements ranging from 62% to 26%.
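An "interactive"-style governor can be caricatured as a load-to-frequency policy that jumps straight to a high frequency on a burst and steps down gradually otherwise; the thresholds and frequency table below are illustrative inventions, not the kernel governor's actual tunables.

```python
# Caricature of an "interactive"-style CPU frequency governor.
# FREQS and all thresholds are illustrative, not real kernel tunables.
FREQS = [400, 800, 1200, 1600, 2000]  # MHz

def next_freq(current, load, hispeed_load=0.85, hispeed_freq=1600):
    i = FREQS.index(current)
    if load >= hispeed_load:
        return max(hispeed_freq, current)   # jump immediately on a burst
    if load >= 0.60 and i + 1 < len(FREQS):
        return FREQS[i + 1]                 # ramp up one step
    if load < 0.30 and i > 0:
        return FREQS[i - 1]                 # ramp down one step
    return current                          # hold steady

f = 400
for load in (0.95, 0.95, 0.70, 0.20, 0.20):
    f = next_freq(f, load)
print(f)
```

The asymmetry (fast up, slow down) is what keeps interactive workloads responsive while still shedding frequency, and energy, once the burst passes.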

RevDate: 2020-03-06

Lin HC, Kuo YC, MY Liu (2020)

A health informatics transformation model based on intelligent cloud computing - exemplified by type 2 diabetes mellitus with related cardiovascular diseases.

Computer methods and programs in biomedicine, 191:105409 pii:S0169-2607(19)31715-8 [Epub ahead of print].

BACKGROUND AND OBJECTIVE: Many studies regarding health analysis require structured datasets, but legacy resources provide scattered data. This study aims to establish a health informatics transformation model (HITM) based upon intelligent cloud computing, with self-developed analytics modules built on open-source techniques. The model is exemplified by open data on type 2 diabetes mellitus (DM2) with related cardiovascular diseases.

METHODS: The Apache Spark framework was employed to generate the infrastructure of the HITM, which supports machine learning (ML) algorithms, including random forest, the multi-layer perceptron classifier, support vector machines, and the naïve Bayes classifier, as well as regression analysis, for intelligent cloud computing. The modeling applied the MIMIC-III open database as an example to design the health informatics data warehouse, which embeds PL/SQL-based modules to extract the analytical data for the training processes. A coupling analysis flow drives the ML modules to train the sample data and validate the results.

RESULTS: Four modes of cloud computation were compared to evaluate the feasibility of the cloud platform, in accordance with its system performance, for more than 11,500 datasets. Then, the modeling adaptability was validated by simulating the featured datasets of obesity and cardiovascular-related diseases for patients with DM2 and its complications. The results showed that the platform completed runs in around one minute and that the prediction accuracy on the featured datasets reached 90%.

CONCLUSIONS: This study contributes a model for the efficient transformation of health informatics. The HITM can be customized for an actual clinical database, which provides big data for training, with the proper ML modules for a predictable process on the cloud platform. The feedback from intelligent computing can inform risk assessment in health promotion.

RevDate: 2020-03-06

Li X, Zhang X, Qiu C, et al (2020)

Rapid Loss of Tidal Flats in the Yangtze River Delta since 1974.

International journal of environmental research and public health, 17(5): pii:ijerph17051636.

As the home to national nature reserves and a Ramsar wetland, the tidal flats of the Yangtze River Delta are of great significance for ecological security, at both the local and global scales. However, a comprehensive understanding of the spatiotemporal conditions of the tidal flats in the Yangtze River Delta remains lacking. Here, we propose using remote sensing to obtain a detailed spatiotemporal profile of the tidal flats, using all available Landsat images from 1974 to 2018 with the help of the Google Earth Engine cloud platform. In addition, reclamation data were manually extracted from time series Landsat images for the same period. We found that approximately 40.0% (34.9-43.1%) of the tidal flats in the study area have been lost since 1980, the year in which the tidal flat area was maximal. The change in the tidal flat areas was consistent with the change in the riverine sediment supply. We also found that the cumulative reclamation areas totaled 816.6 km2 and 431.9 km2 in the Yangtze estuary zone and along the Jiangsu coast, respectively, between 1974 and 2018. Because of reclamation, some areas (e.g., the Hengsha eastern shoal and Pudong bank), which used to be quite rich, have lost most of their tidal flats. Currently, almost 70% of the remaining tidal flats are located in the shrinking branch (North Branch) and the two National Nature Reserves (Chongming Dongtan and Jiuduansha) in the Yangtze estuary zone. Consequently, the large-scale loss of tidal flats observed was primarily associated with reduced sediment supply and land reclamation at the time scale of the study. Because increasing demand for land and rising sea levels are expected in the future, immediate steps should be taken to prevent the further deterioration of this valuable ecosystem.

RevDate: 2020-03-05

Schwengers O, Hoek A, Fritzenwanker M, et al (2020)

ASA3P: An automatic and scalable pipeline for the assembly, annotation and higher level analysis of closely related bacterial isolates.

PLoS computational biology, 16(3):e1007134 pii:PCOMPBIOL-D-19-00828 [Epub ahead of print].

Whole genome sequencing of bacteria has become daily routine in many fields. Advances in DNA sequencing technologies and continuously dropping costs have resulted in a tremendous increase in the amounts of available sequence data. However, comprehensive in-depth analysis of the resulting data remains an arduous and time-consuming task. In order to keep pace with these promising but challenging developments and to transform raw data into valuable information, standardized analyses and scalable software tools are needed. Here, we introduce ASA3P, a fully automatic, locally executable and scalable assembly, annotation and analysis pipeline for bacterial genomes. The pipeline automatically executes necessary data processing steps, i.e. quality clipping and assembly of raw sequencing reads, scaffolding of contigs and annotation of the resulting genome sequences. Furthermore, ASA3P conducts comprehensive genome characterizations and analyses, e.g. taxonomic classification, detection of antibiotic resistance genes and identification of virulence factors. All results are presented via an HTML5 user interface providing aggregated information, interactive visualizations and access to intermediate results in standard bioinformatics file formats. We distribute ASA3P in two versions: a locally executable Docker container for small-to-medium-scale projects and an OpenStack based cloud computing version able to automatically create and manage self-scaling compute clusters. Thus, automatic and standardized analysis of hundreds of bacterial genomes becomes feasible within hours. The software and further information are available at: asap.computational.bio.

RevDate: 2020-03-05

Guzzi F, De Bortoli L, Molina RS, et al (2020)

Distillation of an End-to-End Oracle for Face Verification and Recognition Sensors.

Sensors (Basel, Switzerland), 20(5): pii:s20051369.

Face recognition functions are today exploited through biometric sensors in many applications, from extended security systems to inclusion devices; deep neural network methods are reaching stunning performance in this field. The main limitation of the deep learning approach is the inconvenient relation between the accuracy of the results and the needed computing power. When a personal device is employed, in particular, many algorithms require a cloud computing approach to achieve the expected performance; other algorithms adopt models that are simple by design. A third viable option consists of model (oracle) distillation. This is the most intriguing among the compression techniques since it permits devising the minimal structure that will enforce the same I/O relation as the original model. In this paper, a distillation technique is applied to a complex model, enabling the introduction of fast state-of-the-art recognition capabilities on a low-end hardware face recognition sensor module. Two distilled models are presented in this contribution: the former can be used directly in place of the original oracle, while the latter better embodies the end-to-end approach, removing the need for a separate alignment procedure. The presented biometric systems are examined on the two problems of face verification and face recognition in an open set by using well-agreed training/testing methodologies and datasets.
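The core of the usual distillation objective, training a student to match the teacher's temperature-softened output distribution, can be sketched with made-up logits; this illustrates the general technique, not the authors' specific training setup.

```python
import math

# Toy distillation loss: KL divergence between the teacher's and the
# student's temperature-softened class distributions. Logits are invented.
def softmax(logits, T=1.0):
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q):
    """KL(p || q) for two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

teacher_logits = [8.0, 2.0, 1.0]
student_logits = [6.0, 3.0, 1.5]
T = 4.0  # a higher temperature exposes the teacher's "dark knowledge"
loss = kl(softmax(teacher_logits, T), softmax(student_logits, T))
print(f"distillation loss: {loss:.4f}")
```

Minimizing this loss drives the compact student toward the same input-output relation as the oracle, which is exactly the property the sensor-module deployment needs.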

RevDate: 2020-03-04

Gu D, Yang X, Deng S, et al (2020)

Tracking Knowledge Evolution in Cloud Health Care Research: Knowledge Map and Common Word Analysis.

Journal of medical Internet research, 22(2):e15142 pii:v22i2e15142.

BACKGROUND: With the continuous development of the internet and the explosive growth in data, big data technology has emerged. With its ongoing development and application, cloud computing technology provides better data storage and analysis. The development of cloud health care provides a more convenient and effective solution for health. Studying the evolution of knowledge and research hotspots in the field of cloud health care is increasingly important for medical informatics. Scholars in the medical informatics community need to understand the extent of the evolution of and possible trends in cloud health care research to inform their future research.

OBJECTIVE: Drawing on the cloud health care literature, this study aimed to describe the development and evolution of research themes in cloud health care through a knowledge map and common word analysis.

METHODS: A total of 2878 articles about cloud health care were retrieved from the Web of Science database. We used cybermetrics to analyze and visualize the keywords in these articles. We created a knowledge map to show the evolution of cloud health care research. We used co-word analysis to identify the hotspots and their evolution in cloud health care research.
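At its core, co-word analysis counts how often pairs of keywords co-occur across articles; a minimal sketch with invented keyword lists:

```python
from collections import Counter
from itertools import combinations

# Minimal co-word (keyword co-occurrence) count: each article contributes
# one count to every unordered pair of its keywords. Keyword lists are
# invented examples, not the study's data.
articles = [
    ["cloud computing", "security", "e-health"],
    ["cloud computing", "big data", "e-health"],
    ["machine learning", "cloud computing", "big data"],
]

pairs = Counter()
for kws in articles:
    pairs.update(combinations(sorted(set(kws)), 2))

print(pairs.most_common(3))
```

The highest-count pairs are the "hotspots"; tracking how the top pairs change across publication-year slices yields the evolution the study maps.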

RESULTS: The evolution and development of cloud health care services are described. In 2007-2009 (Phase I), most scholars used cloud computing in the medical field mainly to reduce costs, and grid computing and cloud computing were the primary technologies. In 2010-2012 (Phase II), the security of cloud systems became of interest to scholars. In 2013-2015 (Phase III), medical informatization enabled big data for health services. In 2016-2017 (Phase IV), machine learning and mobile technologies were introduced to the medical field.

CONCLUSIONS: Cloud health care research has been rapidly developing worldwide, and technologies used in cloud health research are simultaneously diverging and becoming smarter. Cloud-based mobile health, cloud-based smart health, and the security of cloud health data and systems are three possible trends in the future development of the cloud health care field.

RevDate: 2020-03-03

Hadley TD, Pettit RW, Malik T, et al (2020)

Artificial Intelligence in Global Health -A Framework and Strategy for Adoption and Sustainability.

International journal of MCH and AIDS, 9(1):121-127.

Artificial Intelligence (AI) applications in medicine have grown considerably in recent years. AI in the forms of Machine Learning, Natural Language Processing, Expert Systems, Planning and Logistics methods, and Image Processing networks provide great analytical aptitude. While AI methods were first conceptualized for radiology, investigations today are established across all medical specialties. The necessity for proper infrastructure, skilled labor, and access to large, well-organized data sets has kept the majority of medical AI applications in higher-income countries. However, critical technological improvements, such as cloud computing and the near-ubiquity of smartphones, have paved the way for use of medical AI applications in resource-poor areas. Global health initiatives (GHI) have already begun to explore ways to leverage medical AI technologies to detect and mitigate public health inequities. For example, AI tools can help optimize vaccine delivery and community healthcare worker routes, thus enabling limited resources to have a maximal impact. Other promising AI tools have demonstrated an ability to: predict burn healing time from smartphone photos; track regions of socioeconomic disparity combined with environmental trends to predict communicable disease outbreaks; and accurately predict pregnancy complications such as birth asphyxia in low resource settings with limited patient clinical data. In this commentary, we discuss the current state of AI-driven GHI and explore relevant lessons from past technology-centered GHI. Additionally, we propose a conceptual framework to guide the development of sustainable strategies for AI-driven GHI, and we outline areas for future research.

RevDate: 2020-03-03

Tariq MI, Ahmed S, Memon NA, et al (2020)

Prioritization of Information Security Controls through Fuzzy AHP for Cloud Computing Networks and Wireless Sensor Networks.

Sensors (Basel, Switzerland), 20(5): pii:s20051310.

With the advent of cloud computing and wireless sensor networks, the number of cyberattacks has rapidly increased. Therefore, proportionate security of networks has become a challenge for organizations. Information security advisors of organizations face difficult and complex decisions in the evaluation and selection of information security controls that permit the defense of their resources and assets. Information security controls must be selected based on an appropriate level of security. However, their selection requires intensive investigation regarding the vulnerabilities, risks, and threats prevailing in the organization, as well as consideration of the organization's implementation, mitigation, and budgetary constraints. The goal of this paper was to improve the information security control analysis method by proposing a formalized approach, i.e., the fuzzy Analytic Hierarchy Process (AHP). This approach was used to prioritize and select the most relevant set of information security controls to satisfy the information security requirements of an organization. We argue that the prioritization of information security controls using fuzzy AHP leads to an efficient and cost-effective assessment and evaluation of information security controls for an organization, in order to select the most appropriate ones. The proposed formalized approach and prioritization processes are based on International Organization for Standardization and International Electrotechnical Commission (ISO/IEC) 27001:2013. In practice, however, organizations may apply this approach to any information security baseline manual.
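The prioritization skeleton of AHP, shown here in its crisp form (the paper's fuzzy variant replaces each judgment with a triangular fuzzy number, but the geometric-mean weighting step is the same shape), can be sketched as:

```python
import math

# Crisp-AHP priority weights via the geometric-mean method. The pairwise
# matrix compares three hypothetical security controls; the fuzzy variant
# in the paper applies the same skeleton to fuzzy judgments.
def ahp_weights(matrix):
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Control A judged 3x as important as B and 5x as important as C;
# B judged 2x as important as C (reciprocals fill the lower triangle).
A = [[1.0,   3.0, 5.0],
     [1/3.0, 1.0, 2.0],
     [1/5.0, 1/2.0, 1.0]]
w = ahp_weights(A)
print([round(x, 3) for x in w])
```

The resulting weights give the priority order in which controls should be implemented under the stated budget constraints.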

RevDate: 2020-03-03

Liu Z, Zhang J, Li Y, et al (2020)

Hierarchical MEC Servers Deployment and User-MEC Server Association in C-RANs over WDM Ring Networks.

Sensors (Basel, Switzerland), 20(5): pii:s20051282.

With the increasing number of Internet of Things (IoT) devices, a huge number of latency-sensitive and computation-intensive IoT applications have been introduced into the network. Deploying mobile edge computing (MEC) servers in a cloud radio access network (C-RAN) is a promising candidate, which brings a number of critical IoT applications to the edge network to reduce the heavy traffic load and the end-to-end latency. The MEC server deployment mechanism is highly related to user allocation. Therefore, in this paper, we study the hierarchical deployment of MEC servers and the user allocation problem. We first formulate the problem as a mixed integer nonlinear programming (MINLP) model to minimize the deployment cost and average latency. For the MINLP model, we then propose an enumeration algorithm and an approximate algorithm based on improved entropy weight and TOPSIS methods. Numerical results show that the proposed algorithms can reduce the total cost, and that the approximate algorithm achieves a lower total cost than the heaviest-location-first and latency-based algorithms.
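The TOPSIS component of such an approximate algorithm ranks candidate options by closeness to an ideal solution; a compact sketch with invented criteria values and weights (not the paper's data):

```python
import math

# TOPSIS sketch: normalize each criterion column, weight it, then score
# each option by relative closeness to the ideal solution. The candidate
# deployment sites, weights, and criteria below are invented.
def topsis(matrix, weights, benefit):
    """benefit[j] is True when higher is better for criterion j."""
    ncols = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    V = [[row[j] / norms[j] * weights[j] for j in range(ncols)] for row in matrix]
    best = [max(c) if benefit[j] else min(c) for j, c in enumerate(zip(*V))]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(zip(*V))]
    scores = []
    for row in V:
        dp = math.dist(row, best)    # distance to the ideal solution
        dm = math.dist(row, worst)   # distance to the anti-ideal solution
        scores.append(dm / (dp + dm))
    return scores

# Sites scored on (capacity, latency ms, cost): capacity is a benefit,
# latency and cost are costs.
sites = [[100, 10, 8], [80, 5, 6], [60, 2, 9]]
scores = topsis(sites, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
print([round(s, 3) for s in scores])
```

The highest-scoring site would be selected first when placing a MEC server; the entropy-weight step the paper pairs with TOPSIS would replace the hand-chosen weights above.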

RevDate: 2020-03-02

Dumont ELP, Tycko B, C Do (2020)

CloudASM: an ultra-efficient cloud-based pipeline for mapping allele-specific DNA methylation.

Bioinformatics (Oxford, England) pii:5771329 [Epub ahead of print].

SUMMARY: Methods for quantifying the imbalance in CpG methylation between alleles genome-wide have been described, but their algorithmic time complexity is quadratic and their practical use requires painstaking attention to infrastructure choice, implementation, and execution. To solve this problem, we developed CloudASM, a scalable, ultra-efficient, turn-key, portable pipeline on Google Cloud Platform (GCP) that uses a novel pipeline manager and GCP's serverless enterprise data warehouse.

CloudASM is freely available in the GitHub repository https://github.com/TyckoLab/CloudASM and a sample dataset and its results are also freely available at https://console.cloud.google.com/storage/browser/cloudasm.

RevDate: 2020-03-02

Alsulami OZ, Alahmadi AA, Saeed SOM, et al (2020)

Optimum resource allocation in optical wireless systems with energy-efficient fog and cloud architectures.

Philosophical transactions. Series A, Mathematical, physical, and engineering sciences, 378(2169):20190188.

Optical wireless communication (OWC) is a promising technology that can provide high data rates while supporting multiple users. The optical wireless (OW) physical layer has been researched extensively; however, less work has been devoted to multiple access and to how the OW front end is connected to the network. In this paper, an OWC system that employs a wavelength division multiple access (WDMA) scheme is studied for the purpose of supporting multiple users. In addition, a cloud/fog architecture is proposed for the first time for OWC to provide processing capabilities. The cloud/fog-integrated architecture uses visible indoor light to create high data rate connections with potential mobile nodes. These OW nodes are further clustered and used as fog mini servers to provide processing services through the OW channel for other users. Additional fog-processing units are located in the room, the building, the campus and at the metro level. Further processing capabilities are provided by remote cloud sites. Two mixed-integer linear programming (MILP) models were proposed to numerically study networking and processing in OW systems. The first MILP model was developed and used to optimize resource allocation in indoor OWC systems, in particular the allocation of access points (APs) and wavelengths to users, while the second MILP model was developed to optimize the placement of processing tasks in the different fog and cloud nodes available. The optimization of task placement in the cloud/fog-integrated architecture was analysed using the MILP models. Multiple scenarios were considered in which the mobile node locations were varied in the room and the amount of processing and data rate requested by each OW node was varied. The results help to identify the optimum colour and AP to use for communication for a given mobile node location and OWC system configuration, the optimum location to place processing, and the impact of the network architecture. This article is part of the theme issue 'Optical wireless communication'.

RevDate: 2020-02-29

Fozoonmayeh D, Le HV, Wittfoth E, et al (2020)

A Scalable Smartwatch-Based Medication Intake Detection System Using Distributed Machine Learning.

Journal of medical systems, 44(4):76 pii:10.1007/s10916-019-1518-8.

Poor medication adherence has a significant economic impact, resulting in hospital readmissions, hospital visits, and other healthcare costs. The authors developed a smartwatch application and a cloud-based data pipeline for a user-friendly medication intake monitoring system that can contribute to improving medication adherence. The Android smartwatch application collects activity sensor data using the accelerometer and gyroscope. The cloud-based data pipeline includes distributed data storage, a distributed database management system, and distributed computing frameworks for building a machine learning model that identifies activity types from sensor data. With the proposed sensor data extraction, preprocessing, and machine learning algorithms, this study achieved a high F1 score of 0.977 with 13.313 seconds of training time and 0.139 seconds for testing.

RevDate: 2020-02-28

Campbell AD, Y Wang (2020)

Salt marsh monitoring along the mid-Atlantic coast by Google Earth Engine enabled time series.

PloS one, 15(2):e0229605 pii:PONE-D-19-25573.

Salt marshes provide a bulwark against sea-level rise (SLR), an interface between aquatic and terrestrial habitats, important nursery grounds for many species, a buffer against extreme storm impacts, and vast blue carbon repositories. However, salt marshes are at risk of loss from a variety of stressors such as SLR, nutrient enrichment, sediment deficits, herbivory, and anthropogenic disturbances. Determining the dynamics of salt marsh change with remote sensing requires high temporal resolution due to the spectral variability caused by disturbance, tides, and seasonality. Time series analysis of salt marshes can broaden our understanding of these changing environments. This study analyzed aboveground green biomass (AGB) in seven mid-Atlantic Hydrological Unit Code 8 (HUC-8) watersheds. The study revealed that the Eastern Lower Delmarva watershed had the highest average loss and the largest net reduction in salt marsh AGB from 1999-2018. The study developed a method that used Google Earth Engine (GEE)-enabled time series of the Landsat archive for regional analysis of salt marsh change, and it identified at-risk watersheds and salt marshes, providing insight into the resilience and management of these ecosystems. The time series were filtered by cloud cover and the Tidal Marsh Inundation Index (TMII). The combination of GEE-enabled Landsat time series and TMII filtering demonstrated a promising method for historic assessment and continued monitoring of salt marsh dynamics.

RevDate: 2020-02-28

Rodrigues VF, Paim EP, Kunst R, et al (2020)

Exploring publish/subscribe, multilevel cloud elasticity, and data compression in telemedicine.

Computer methods and programs in biomedicine, 191:105403 pii:S0169-2607(19)30060-4 [Epub ahead of print].

BACKGROUND AND OBJECTIVE: Multiple medical specialties rely on image data, typically following the Digital Imaging and Communications in Medicine (DICOM) ISO 12052 standard, to support diagnosis through telemedicine. Remote analysis by different physicians requires the same image to be transmitted simultaneously to different destinations in real-time. This scenario poses a need for a large number of resources to store and transmit DICOM images in real-time, which has been explored using some cloud-based solutions. However, these solutions lack strategies to improve the performance through the cloud elasticity feature. In this context, this article proposes a cloud-based publish/subscribe (PubSub) model, called PS2DICOM, which employs multilevel resource elasticity to improve the performance of DICOM data transmissions.

METHODS: A prototype is implemented to evaluate PS2DICOM. A PubSub communication model is adopted, considering the coexistence of two classes of users: (i) image data producers (publishers); and (ii) image data consumers (subscribers). PS2DICOM employs a cloud infrastructure to guarantee service availability and performance through resource elasticity in two levels of the cloud: (i) brokers and (ii) data storage. In addition, images are compressed prior to the transmission to reduce the demand for network resources using one of three different algorithms: (i) DEFLATE, (ii) LZMA, and (iii) BZIP2. PS2DICOM employs dynamic data compression levels at the client side to improve network performance according to the current available network throughput.
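The three compression algorithms listed in (i)-(iii) correspond directly to Python's standard-library codecs, which makes the trade-off easy to demonstrate. The snippet below is a generic comparison on a synthetic payload, not the PS2DICOM implementation (DICOM parsing and the dynamic level selection are omitted):

```python
import bz2
import lzma
import zlib

# Compare the three codecs named above (DEFLATE, LZMA, BZIP2) on a
# synthetic repetitive payload. zlib implements DEFLATE; the level /
# preset / compresslevel arguments trade CPU time for compression ratio,
# which is the knob a client can tune against available throughput.
payload = b"DICOM pixel data placeholder " * 1000

candidates = {
    "DEFLATE": zlib.compress(payload, level=6),
    "LZMA": lzma.compress(payload, preset=6),
    "BZIP2": bz2.compress(payload, compresslevel=6),
}
for name, blob in candidates.items():
    print(f"{name}: {len(payload)} -> {len(blob)} bytes")

# Round-trip check for DEFLATE.
assert zlib.decompress(candidates["DEFLATE"]) == payload
```

On highly compressible medical imagery the ratio differences between the three codecs, and between levels within one codec, are what an adaptive client-side selector like the one described above would exploit.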

RESULTS: Results indicate that PS2DICOM can improve transmission quality, storage capabilities, and the querying and retrieving of DICOM images. The general efficiency gain is approximately 35% in data sending and receiving operations. This gain results from the two levels of elasticity, which allow resources to be scaled up or down automatically and transparently.

CONCLUSIONS: The contributions of PS2DICOM are twofold: (i) multilevel cloud elasticity to adapt the computing resources on demand; (ii) adaptive data compression to meet the network quality and optimize data transmission. Results suggest that the use of compression in medical image data using PS2DICOM can improve the transmission efficiency, allowing the team of specialists to communicate in real-time, even when they are geographically distant.

RevDate: 2020-02-25

Li M, Tian T, Zeng Y, et al (2020)

An Individual Cloud-Based Fingerprint Operation Platform for Latent Fingerprint Identification Using Perovskite Nanocrystals as Eikonogen.

ACS applied materials & interfaces [Epub ahead of print].

A fingerprint, formed by the lifted papillary ridges, is considered the best reference for personal identification. However, currently available latent fingerprint (LFP) images often suffer from poor resolution and limited information, and require multifarious steps for identification. Herein, an individual Cloud-based fingerprint operation platform has been designed and fabricated to achieve high-definition LFP analysis using CsPbBr3 perovskite nanocrystals (NCs) as eikonogen. Moreover, since CsPbBr3 NCs have a special response to some fingerprint-associated amino acids, the proposed platform can be further used to detect metabolites on LFPs. Consequently, by virtue of Cloud computing and artificial intelligence (AI), this study has demonstrated a champion platform for the whole LFP identification analysis. In a double-blind simulated crime game, the enhanced LFP images could be easily obtained and used to lock onto the suspect accurately within one second on a smartphone, which can help investigators track criminal clues and handle cases efficiently.

RevDate: 2020-02-21

Nkenyereye L, Nkenyereye L, Tama BA, et al (2020)

Software-Defined Vehicular Cloud Networks: Architecture, Applications and Virtual Machine Migration.

Sensors (Basel, Switzerland), 20(4): pii:s20041092.

Cloud computing supports many unprecedented cloud-based vehicular applications. To improve connectivity and bandwidth through programmable networking architectures, the Software-Defined (SD) Vehicular Network (SDVN) was introduced. The SDVN architecture enables vehicles to be equipped with an SDN OpenFlow switch whose routing rules are updated from an SDN OpenFlow controller. From SDVN, new vehicular architectures have been introduced, for instance the SD Vehicular Cloud (SDVC). In the SDVC, vehicles are SDN devices that host virtualization technology for deploying cloud-based vehicular applications. In addition, the migration of Virtual Machines (VMs) over the SDVC challenges the performance of cloud-based vehicular applications due to the high mobility of vehicles. However, the current literature discussing VM migration in the SDVC is very limited. In this paper, we first analyze the evolution of the computation and networking technologies of the SDVC, with a focus on its architecture within the cloud-based vehicular environment. Then, we discuss the potential cloud-based vehicular applications assisted by the SDVC, along with its ability to manage several VM migration scenarios. Lastly, we provide a detailed comparison of existing frameworks in the SDVC that integrate the VM migration approach, and of the different network emulators or simulators used to evaluate the frameworks' use cases.

RevDate: 2020-02-21

Ali M, Sadeghi MR, X Liu (2020)

Lightweight Fine-Grained Access Control for Wireless Body Area Networks.

Sensors (Basel, Switzerland), 20(4): pii:s20041088.

Wireless Body Area Network (WBAN) is a highly promising technology enabling health providers to remotely monitor vital parameters of patients via tiny wearable and implantable sensors. In a WBAN, medical data is collected by several tiny sensors and usually transmitted to a server side (e.g., a cloud service provider) for long-term storage and online/offline processing. However, as health data includes sensitive information, providing confidentiality and fine-grained access control is necessary to preserve the privacy of patients. In this paper, we design an attribute-based encryption (ABE) scheme with lightweight encryption and decryption mechanisms. Our scheme enables tiny sensors to encrypt the collected data under an access control policy by performing very few computational operations. The computational overhead on users in the decryption phase is also lightweight, with most of the operations performed by the cloud server. In comparison with several excellent ABE schemes, our encryption mechanism is more than 100 times faster, and the communication overhead in our scheme decreases significantly. We provide the security definition for the new primitive and prove its security in the standard model under the hardness assumption of the decisional bilinear Diffie-Hellman (DBDH) problem.

RevDate: 2020-02-19

Han L, Zheng T, Zhu Y, et al (2020)

Live Semantic 3D Perception for Immersive Augmented Reality.

IEEE transactions on visualization and computer graphics [Epub ahead of print].

Semantic understanding of 3D environments is critical both for unmanned systems and for the human-involved virtual/augmented reality (VR/AR) immersive experience. Spatially-sparse convolution, taking advantage of the intrinsic sparsity of 3D point cloud data, makes high resolution 3D convolutional neural networks tractable with state-of-the-art results on 3D semantic segmentation problems. However, the exhaustive computations limit the practical usage of semantic 3D perception for VR/AR applications on portable devices. In this paper, we identify that the efficiency bottleneck lies in the unorganized memory access of the sparse convolution steps, i.e., the points are stored independently based on a predefined dictionary, which is inefficient due to the limited memory bandwidth of parallel computing devices (GPUs). With the insight that points are continuous as 2D surfaces in 3D space, a chunk-based sparse convolution scheme is proposed to reuse the neighboring points within each spatially organized chunk. An efficient multi-layer adaptive fusion module is further proposed to employ the spatial consistency cue of 3D data to further reduce the computational burden. Quantitative experiments on public datasets demonstrate that our approach works 11× faster than previous approaches with competitive accuracy. By implementing both semantic and geometric 3D reconstruction simultaneously on a portable tablet device, we demonstrate a foundation platform for immersive AR applications.

RevDate: 2020-02-12

Marah BD, Jing Z, Ma T, et al (2020)

Smartphone Architecture for Edge-Centric IoT Analytics.

Sensors (Basel, Switzerland), 20(3): pii:s20030892.

The current baseline architectures in the field of the Internet of Things (IoT) strongly recommend the use of edge computing in the design of solution applications, instead of the traditional approach, which solely uses the cloud/core for analysis and data storage. This research therefore focuses on formulating an edge-centric IoT architecture for smartphones, which are very popular electronic devices capable of executing complex computational tasks at the network edge. A novel smartphone IoT architecture (SMIoT) is introduced that supports data capture and preprocessing, model (i.e., machine learning model) deployment, model evaluation, and model updating tasks. Moreover, a novel model evaluation and updating scheme is provided that ensures model validation in real time. This ensures a sustainable and reliable model at the network edge that automatically adjusts to changes in the IoT data subspace. Finally, the proposed architecture is tested and evaluated using an IoT use case.

RevDate: 2020-02-11

Chattopadhyay A, TP Lu (2019)

Gene-gene interaction: the curse of dimensionality.

Annals of translational medicine, 7(24):813.

Identified genetic variants from genome-wide association studies frequently show only modest effects on disease risk, leading to the "missing heritability" problem. One avenue to account for a part of this "missingness" is to evaluate gene-gene interactions (epistasis), thereby elucidating their effect on complex diseases. This can potentially help with identifying gene functions, pathways, and drug targets. However, the exhaustive evaluation of all possible genetic interactions among millions of single nucleotide polymorphisms (SNPs) raises several issues, otherwise known as the "curse of dimensionality". The dimensionality involved in the epistatic analysis of such exponentially growing SNPs diminishes the usefulness of traditional, parametric statistical methods. The immense popularity of multifactor dimensionality reduction (MDR), a non-parametric method proposed in 2001 that classifies multi-dimensional genotypes into one-dimensional binary groups, led to the emergence of a fast-growing collection of methods based on the MDR approach. Moreover, machine-learning (ML) methods such as random forests and neural networks (NNs), deep-learning (DL) approaches, and hybrid approaches have also been applied profusely in recent years to tackle the dimensionality issue associated with whole-genome gene-gene interaction studies. However, exhaustive searching in MDR-based approaches, or variable selection in ML methods, still poses the risk of missing out on relevant SNPs. Furthermore, interpretability issues are a major hindrance for DL methods. To minimize this loss of information, Python-based tools such as PySpark can potentially take advantage of distributed computing resources in the cloud to bring back smaller subsets of data for further local analysis. Parallel computing can be a powerful resource that stands to fight this "curse". PySpark supports all standard Python libraries and C extensions, making it convenient to write code that delivers dramatic improvements in processing speed for extraordinarily large data sets.
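The combinatorial blow-up behind this "curse" is easy to make concrete: exhaustive pairwise epistasis testing over n SNPs requires C(n, 2) tests. A minimal illustration in plain Python (rather than PySpark), with invented SNP labels:

```python
from itertools import combinations
from math import comb

# Exhaustive pairwise (2-way) interaction enumeration over a toy SNP
# panel. Real GWAS panels have millions of SNPs, so this count becomes
# astronomically large, which is why distributed engines such as
# PySpark are brought in.
snps = [f"rs{i}" for i in range(1, 7)]  # 6 invented SNP identifiers

pairs = list(combinations(snps, 2))
print(len(pairs), comb(len(snps), 2))  # C(6, 2) = 15 pairs either way

# Scaling: 1 million SNPs -> about 5e11 pairwise tests.
n = 1_000_000
print(comb(n, 2))
```

Each pair would still need a statistical test against phenotype data, so the enumeration above is only the cheap outer loop of an exhaustive epistasis scan.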

RevDate: 2020-02-10

Peri S, Roberts S, Kreko IR, et al (2019)

Read Mapping and Transcript Assembly: A Scalable and High-Throughput Workflow for the Processing and Analysis of Ribonucleic Acid Sequencing Data.

Frontiers in genetics, 10:1361.

Next-generation RNA-sequencing is an incredibly powerful means of generating a snapshot of the transcriptomic state within a cell, tissue, or whole organism. As the questions addressed by RNA-sequencing (RNA-seq) become both more complex and greater in number, there is a need to simplify RNA-seq processing workflows, make them more efficient and interoperable, and capable of handling both large and small datasets. This is especially important for researchers who need to process hundreds to tens of thousands of RNA-seq datasets. To address these needs, we have developed a scalable, user-friendly, and easily deployable analysis suite called RMTA (Read Mapping, Transcript Assembly). RMTA can easily process thousands of RNA-seq datasets, with features that include automated read quality analysis, filters for lowly expressed transcripts, and read counting for differential expression analysis. RMTA is containerized using Docker for easy deployment within any compute environment [cloud, local, or high-performance computing (HPC)] and is available as two apps in CyVerse's Discovery Environment: one for normal use and one specifically designed for introducing undergraduate and high school students to RNA-seq analysis. For extremely large datasets (tens of thousands of FASTQ files) we developed a high-throughput, scalable, and parallelized version of RMTA optimized for launching on the Open Science Grid (OSG) from within the Discovery Environment. OSG-RMTA allows users to utilize the Discovery Environment for data management, parallelization, and submitting jobs to the OSG, and finally, to employ the OSG for distributed, high-throughput computing. Alternatively, OSG-RMTA can be run directly on the OSG through the command line. RMTA is designed to be useful for data scientists of any skill level who are interested in rapidly and reproducibly analyzing large RNA-seq data sets.

RevDate: 2020-02-08

Ramírez-Faz J, Fernández-Ahumada LM, Fernández-Ahumada E, et al (2020)

Monitoring of Temperature in Retail Refrigerated Cabinets Applying IoT Over Open-Source Hardware and Software.

Sensors (Basel, Switzerland), 20(3): pii:s20030846.

The control of refrigeration in the food chain is fundamental at all stages, with special emphasis on the retail stage. The implementation of information and communication technologies (IoT, open-source hardware and software, cloud computing, etc.) represents a revolution in the operational paradigm of food control. This paper presents a low-cost IoT solution, based on free hardware and software, for monitoring the temperature in refrigerated retail cabinets. Specifically, the use of the ESP8266 Wi-Fi microcontroller with DS18B20 temperature sensors is proposed. The ThingSpeak IoT platform is used to store and process data in the cloud. The solution presented is robust, affordable, and flexible, allowing its scope to be extended to the supervision of other relevant parameters in the operating process (light control, energy efficiency, consumer presence, etc.).
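As a rough sketch of the cloud side of this architecture: ThingSpeak channels are written to over a simple REST update endpoint. The write key below is a placeholder, and on the ESP8266 the equivalent HTTP GET would be issued from firmware rather than from Python:

```python
from urllib.parse import urlencode

# Sketch of pushing one DS18B20 reading to ThingSpeak's REST update
# endpoint, as in the architecture described above. The API key is a
# hypothetical placeholder, not a real credential.
THINGSPEAK_KEY = "YOUR_WRITE_API_KEY"

def update_url(temperature_c: float) -> str:
    """Build the channel-update URL carrying one temperature sample."""
    query = urlencode({"api_key": THINGSPEAK_KEY, "field1": temperature_c})
    return f"https://api.thingspeak.com/update?{query}"

url = update_url(4.5)  # a typical refrigerated-cabinet temperature
print(url)
# A full client would then issue the request, e.g. with
# urllib.request.urlopen(url), to submit the sample to the channel.
```

Only URL construction is shown so the sketch stays self-contained; the actual network call is left as a comment.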

RevDate: 2020-02-06

Xu J, Yang S, Lu W, et al (2020)

Incentivizing for Truth Discovery in Edge-assisted Large-scale Mobile Crowdsensing.

Sensors (Basel, Switzerland), 20(3): pii:s20030805.

The recent development of human-carried mobile devices has promoted the great development of mobile crowdsensing systems. Most existing mobile crowdsensing systems depend on the crowdsensing service of the deep cloud. With increasing scale and complexity, there is a tendency to enhance mobile crowdsensing with the edge computing paradigm to reduce latency and computational complexity, and to improve expandability and security. In this paper, we propose an integrated solution to stimulate strategic users to contribute more to truth discovery in edge-assisted mobile crowdsensing. We design an incentive mechanism consisting of a truth discovery stage and a budget-feasible reverse auction stage. In the truth discovery stage, we estimate the truth for each task in both the deep cloud and the edge cloud. In the budget-feasible reverse auction stage, we design a greedy algorithm to select the winners so as to maximize the quality function under the budget constraint. Through extensive simulations, we demonstrate that the proposed mechanism is computationally efficient, individually rational, truthful, budget feasible, and constant approximate. Moreover, the proposed mechanism shows great superiority in terms of estimation precision and expandability.
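A minimal sketch of the budget-feasible greedy winner selection described above, with an invented additive quality measure and invented bids; the paper's actual quality function and payment rule are more involved:

```python
# Greedy budget-feasible winner selection: repeatedly pick the bidder
# with the best quality-per-cost ratio while the budget allows. Bids and
# quality values below are invented for illustration.

def select_winners(bids, budget):
    """bids: list of (user, cost, quality). Returns (winners, total_spent)."""
    by_ratio = sorted(bids, key=lambda b: b[2] / b[1], reverse=True)
    winners, spent = [], 0.0
    for user, cost, quality in by_ratio:
        if spent + cost <= budget:
            winners.append(user)
            spent += cost
    return winners, spent

bids = [("u1", 4.0, 10.0), ("u2", 3.0, 9.0), ("u3", 5.0, 6.0), ("u4", 2.0, 2.0)]
winners, spent = select_winners(bids, budget=8.0)
print(winners, spent)
```

With the invented bids, u2 (ratio 3.0) and u1 (ratio 2.5) are selected for a total cost of 7, and u3 is skipped because it would exceed the budget. A truthful mechanism would additionally compute threshold payments rather than paying the bids directly.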

RevDate: 2020-02-08

Sazib N, Mladenova I, J Bolten (2018)

Leveraging Google Earth Engine for Drought Assessment using Global Soil Moisture Data.

Remote sensing, 10(8):.

Soil moisture is considered a key variable for assessing crop and drought conditions. However, readily available soil moisture datasets developed for monitoring agricultural drought conditions are uncommon. The aim of this work is to examine two global soil moisture data sets and a set of web-based soil moisture processing tools developed to demonstrate the value of the soil moisture data for drought monitoring and crop forecasting using Google Earth Engine (GEE). The two global soil moisture data sets discussed in the paper are generated by integrating Soil Moisture Ocean Salinity (SMOS) and Soil Moisture Active Passive (SMAP) satellite-derived observations into the modified two-layer Palmer model using a 1-D Ensemble Kalman Filter (EnKF) data assimilation approach. The web-based tools are designed to explore soil moisture variability as a function of land cover change, to easily estimate drought characteristics such as drought duration and intensity using soil moisture anomalies, and to inter-compare them against alternative drought indicators. To demonstrate the utility of these tools for agricultural drought monitoring, the soil moisture products and vegetation- and precipitation-based products are assessed over drought-prone regions in South Africa and Ethiopia. Overall, the 3-month scale Standardized Precipitation Index (SPI) and Normalized Difference Vegetation Index (NDVI) showed higher agreement with the root zone soil moisture anomalies. Soil moisture anomalies exhibited shorter drought duration but higher intensity compared to the SPI. Inclusion of the global soil moisture data in the GEE data catalog and the development of the web-based tools described in the paper enable a vast diversity of users to quickly and easily assess the impact of drought and improve planning related to drought risk assessment and early warning. GEE also improves the accessibility and usability of earth observation data and related tools by making them available to a wide range of researchers and the public. In particular, the cloud-based nature of GEE is useful for providing access to the soil moisture data and scripts to users in developing countries that lack adequate observational soil moisture data or the computational resources required to develop them.
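Soil moisture "anomalies" here are standardized departures from a climatological baseline, analogous to how SPI standardizes precipitation. A toy computation with invented monthly values (a real SPI fits a Gamma distribution rather than assuming normality):

```python
from statistics import mean, stdev

# Standardized anomaly: (value - climatological mean) / climatological
# standard deviation. Negative values indicate drier-than-normal
# conditions; a run of negatives gives drought duration, its magnitude
# gives intensity. All values below are invented.
climatology = [0.30, 0.28, 0.33, 0.31, 0.29, 0.32, 0.30, 0.27, 0.31, 0.30]
mu, sigma = mean(climatology), stdev(climatology)

def anomaly(value):
    return (value - mu) / sigma

observed = [0.22, 0.24, 0.26, 0.31]  # four months of toy soil moisture
z = [round(anomaly(v), 2) for v in observed]
drought_months = sum(1 for zi in z if zi < -1.0)  # a common threshold
print(z, drought_months)
```

In this toy series the first three months fall below the -1.0 threshold (a three-month drought of decreasing intensity) and the fourth month returns to near normal.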

RevDate: 2020-01-31

Hettige S, Dasanayaka E, DS Ediriweera (2020)

Usage of cloud storage facilities by medical students in a low-middle income country, Sri Lanka: a cross sectional study.

BMC medical informatics and decision making, 20(1):10.

BACKGROUND: Cloud storage facilities (CSF) have become popular among internet users. There are limited data on CSF usage among university students in low- and middle-income countries, including Sri Lanka. In this study we present CSF usage among medical students at the Faculty of Medicine, University of Kelaniya.

METHODS: We undertook a cross sectional study at the Faculty of Medicine, University of Kelaniya, Sri Lanka. Stratified random sampling was used to recruit students representing all the batches. A self-administered questionnaire was given.

RESULTS: Of 261 (90.9%) respondents, 181 (69.3%) were females. CSF awareness was 56.5% (95%CI: 50.3-62.6%) and CSF usage was 50.8% (95%CI: 44.4-57.2%). Awareness was higher in males (P = 0.003) and was lower in senior students. Of the CSF-aware students, 85% knew about Google Drive and 70.6% used it. 73.6% and 42.1% knew about Dropbox and OneDrive, and 50.0% and 22.0% used them, respectively. There was no association between CSF awareness and pre-university entrance or undergraduate examination performance. Inadequate knowledge, time, accessibility, and security and privacy concerns limited CSF usage. 69.8% indicated that they would like to undergo training on CSF as an effective tool for education.

CONCLUSION: CSF awareness and usage among the students were 56.5% and 50.8%, respectively. Google Drive is the most popular CSF. Lack of knowledge, poor accessibility, and concerns about security and privacy limited CSF usage among students. The majority were interested in undergoing training on CSF, and undergraduate Information and Communication Technology (ICT) curricula should introduce CSF as effective educational tools.
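The confidence intervals quoted above are standard intervals for a proportion and can be approximately reproduced from the reported sample size. The snippet assumes a Wald (normal-approximation) interval, which may not be exactly the method the authors used:

```python
from math import sqrt

# Wald 95% confidence interval for a proportion:
#   p +/- 1.96 * sqrt(p * (1 - p) / n)
def wald_ci(p, n, z=1.96):
    half = z * sqrt(p * (1 - p) / n)
    return p - half, p + half

# Awareness: 56.5% of 261 respondents, as reported above.
lo, hi = wald_ci(0.565, 261)
print(f"{lo:.1%} - {hi:.1%}")
```

The result (about 50.5-62.5%) is close to, but not identical with, the quoted 50.3-62.6%, suggesting the authors used a slightly different interval (e.g., Wilson) or different rounding.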

RevDate: 2020-01-30

Alghazo JM (2019)

Intelligent Security and Privacy of Electronic Health Records Using Biometric Images.

Current medical imaging reviews, 15(4):386-394.

BACKGROUND: In the presence of Cloud Environment and the migration of Electronic Health Systems and records to the Cloud, patient privacy has become an emergent problem for healthcare institutions. Government bylaws, electronic health documentation, and innovative internet health services generate numerous security issues for healthcare conformity and information security groups. To deal with these issues, healthcare institutes must protect essential IT infrastructure from unauthorized use by insiders and hackers. The Cloud Computing archetype allows for EHealth methods that improve the features and functionality of systems on the cloud. On the other hand, sending patients' medical information and records to the Cloud entails a number of risks in the protection and privacy of the health records during the communication process.

AIM: In this paper, a solution is proposed for the security of Electronic Health Records (EHRs) in cloud environment during the process of sending the data to the cloud. In addition, the proposed method uses biometric images that allow for unified patient identification across cloud-based EHRs and across medical institutions.

METHOD: To protect the privacy of patients' information and streamline the migration process, a watermarking-based method is proposed for health care providers to ensure that patients' data are only accessible to authorized personnel. Patients' information, such as name, id, symptoms, diseases, and previous history, is secured in biometric images of patients as an encrypted watermark.

RESULTS: Quality, imperceptibility, and robustness analyses were performed to test the proposed method. The PSNR values show that the proposed method produced excellent results.

CONCLUSION: The robustness and imperceptibility of the proposed method were tested by subjecting the watermarked images to different simulated attacks. The watermarks were largely impermeable to varied and repeated attacks.
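The embed-then-extract watermarking idea can be sketched generically with a least-significant-bit (LSB) scheme. This is a textbook illustration, not the paper's algorithm, and the XOR "encryption" is a placeholder for a real cipher:

```python
# Generic LSB watermarking sketch: hide bytes in the least significant
# bits of pixel values. A grayscale "image" is modeled as a flat list of
# 0-255 ints. The XOR "encryption" is a stand-in for a real cipher.
KEY = 0x5A  # toy key, for illustration only

def embed(pixels, message: bytes):
    """Write the encrypted message bits, MSB first, into pixel LSBs."""
    bits = [(b ^ KEY) >> i & 1 for b in message for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "image too small for payload"
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite only the LSB
    return out

def extract(pixels, length: int) -> bytes:
    """Read length bytes back out of the pixel LSBs and decrypt."""
    bits = [p & 1 for p in pixels[: length * 8]]
    return bytes(
        int("".join(map(str, bits[i : i + 8])), 2) ^ KEY
        for i in range(0, len(bits), 8)
    )

cover = list(range(256)) * 4          # toy 1024-pixel "image"
record = b"id=42;dx=AF"               # invented patient record fragment
stego = embed(cover, record)
assert extract(stego, len(record)) == record
print(max(abs(a - b) for a, b in zip(cover, stego)))
```

Because an LSB change alters each pixel value by at most 1, the stego image stays visually indistinguishable from the cover, which is what the high PSNR values reported above measure.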

RevDate: 2020-01-25

Shao M, Zhou Z, Bin G, et al (2020)

A Wearable Electrocardiogram Telemonitoring System for Atrial Fibrillation Detection.

Sensors (Basel, Switzerland), 20(3): pii:s20030606.

In this paper we proposed a wearable electrocardiogram (ECG) telemonitoring system for atrial fibrillation (AF) detection based on a smartphone and cloud computing. A wearable ECG patch was designed to collect ECG signals and send the signals to an Android smartphone via Bluetooth. An Android APP was developed to display the ECG waveforms in real time and to transmit ECG data to a remote cloud server every 30 s. A machine learning (CatBoost)-based ECG classification method was proposed to detect AF on the cloud server. In case of detected AF, the cloud server pushed the ECG data and classification results to the web browser of a doctor. Finally, the Android APP displayed the doctor's diagnosis for the ECG signals. Experimental results showed that the proposed CatBoost classifier, trained with 17 selected features, achieved an overall F1 score of 0.92 on the test set (n = 7,270). The proposed wearable ECG monitoring system may potentially be useful for long-term ECG telemonitoring for AF detection.

RevDate: 2020-01-29
CmpDate: 2020-01-29

Vanus J, Fiedorova K, Kubicek J, et al (2020)

Wavelet-Based Filtration Procedure for Denoising the Predicted CO2 Waveforms in Smart Home within the Internet of Things.

Sensors (Basel, Switzerland), 20(3): pii:s20030620.

The operating costs of smart homes can be minimized by optimizing the management of the building's technical functions based on the current occupancy status of the individual monitored spaces. To respect the privacy of smart home residents, indirect methods (without cameras or microphones) are possible for occupancy recognition of spaces in smart homes. This article describes a newly proposed indirect method to increase the accuracy of occupancy recognition in monitored spaces of smart homes. The proposed procedure predicts the course of CO2 concentration from operationally measured quantities (indoor temperature and indoor relative humidity) using artificial neural networks with a multilayer perceptron algorithm. The mathematical wavelet transformation method is used to cancel additive noise from the predicted CO2 concentration signal, with the objective of increasing the accuracy of the prediction. The calculated accuracy of CO2 concentration waveform prediction in the additive noise-canceling application was higher than 98% in selected experiments.
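Wavelet denoising follows the transform, threshold, reconstruct pattern described above. Below is a minimal single-level Haar version in plain Python; the paper's wavelet family, decomposition depth, and thresholding rule are not specified here, so these choices are illustrative:

```python
from math import sqrt

# Single-level Haar wavelet denoising: split a signal into approximation
# and detail coefficients, zero out small (noise-dominated) details,
# and reconstruct. Requires an even-length signal.

def haar_forward(x):
    a = [(x[i] + x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    return a, d

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / sqrt(2))
        out.append((ai - di) / sqrt(2))
    return out

def denoise(x, threshold):
    a, d = haar_forward(x)
    d = [di if abs(di) > threshold else 0.0 for di in d]  # hard threshold
    return haar_inverse(a, d)

# Toy "CO2 concentration" ramp with small additive jitter (ppm).
signal = [400.0, 401.0, 402.5, 402.0, 405.0, 404.5, 407.0, 408.0]
clean = denoise(signal, threshold=1.0)
print([round(v, 2) for v in clean])
```

With all detail coefficients below the threshold, each pair of samples collapses to its average, smoothing out the jitter while preserving the overall ramp; a multi-level transform with a soft threshold would be the more typical production choice.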

RevDate: 2020-01-29
CmpDate: 2020-01-29

Wei H, Luo H, Y Sun (2020)

Mobility-Aware Service Caching in Mobile Edge Computing for Internet of Things.

Sensors (Basel, Switzerland), 20(3): pii:s20030610.

The mobile edge computing architecture successfully solves the problem of high latency in cloud computing. However, current research focuses on computation offloading and lacks research on service caching issues. To solve the service caching problem, especially for scenarios with high mobility in sensor-network environments, we study a mobility-aware service caching mechanism. Our goal is to maximize the number of users who are served by the local edge-cloud, and we need to make predictions about the user's target location to avoid invalid service requests. First, we propose an idealized geometric model to predict the target area of a user's movement. Since it is difficult to obtain all the data needed by the model in practical applications, we use frequent patterns to mine local moving track information. Then, by using the results of the trajectory data mining and the proposed geometric model, we make predictions about the user's target location. Based on the prediction result and the existing service cache, the service request is forwarded to the appropriate base station through the service allocation algorithm. Finally, to be able to train and predict the most popular services online, we propose a service cache selection algorithm based on a back-propagation (BP) neural network. The simulation experiments show that our service cache algorithm reduces the service response time by about 13.21% on average compared to other algorithms, and increases the local service proportion by about 15.19% on average compared to the algorithm without mobility prediction.

RevDate: 2020-02-05

Blatti C, Emad A, Berry MJ, et al (2020)

Knowledge-guided analysis of "omics" data using the KnowEnG cloud platform.

PLoS biology, 18(1):e3000583.

We present Knowledge Engine for Genomics (KnowEnG), a free-to-use computational system for analysis of genomics data sets, designed to accelerate biomedical discovery. It includes tools for popular bioinformatics tasks such as gene prioritization, sample clustering, gene set analysis, and expression signature analysis. The system specializes in "knowledge-guided" data mining and machine learning algorithms, in which user-provided data are analyzed in light of prior information about genes, aggregated from numerous knowledge bases and encoded in a massive "Knowledge Network." KnowEnG adheres to "FAIR" principles (findable, accessible, interoperable, and reusable): its tools are easily portable to diverse computing environments, run on the cloud for scalable and cost-effective execution, and are interoperable with other computing platforms. The analysis tools are made available through multiple access modes, including a web portal with specialized visualization modules. We demonstrate the KnowEnG system's potential value in democratization of advanced tools for the modern genomics era through several case studies that use its tools to recreate and expand upon the published analysis of cancer data sets.

RevDate: 2020-01-24
CmpDate: 2020-01-24

Moleda M, Momot A, D Mrozek (2020)

Predictive Maintenance of Boiler Feed Water Pumps Using SCADA Data.

Sensors (Basel, Switzerland), 20(2): pii:s20020571.

IoT-enabled predictive maintenance allows companies in the energy sector to identify potential problems in production devices long before a failure occurs. In this paper, we propose a method for early detection of faults in boiler feed pumps using existing measurements currently captured by control devices. In the experimental part, we work on real measurement data and events from a coal-fired power plant. The main research objective is to implement a model that detects deviations from the normal operation state based on regression and to check which events or failures can be detected by it. The presented technique allows the creation of a predictive system working on the basis of the available data with a minimal requirement of expert knowledge, in particular the knowledge related to the categorization of failures and the exact time of their occurrence, which is sometimes difficult to identify. The paper shows that with modern technologies, such as the Internet of Things, big data, and cloud computing, it is possible to integrate automation systems, designed in the past only to control the production process, with IT systems that make all processes more efficient through the use of advanced analytic tools.
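
The core detection idea, a regression model of normal operation whose residuals flag deviations, can be sketched in a few lines. The signals, the linear model, and the 3-sigma threshold below are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: residual-based anomaly detection on SCADA-like signals.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
load = rng.uniform(50, 100, 300)                 # boiler load, % (synthetic)
flow = 2.0 * load + rng.normal(0, 1.5, 300)      # feed-water flow, normal ops
model = LinearRegression().fit(load.reshape(-1, 1), flow)

# Threshold on residuals of the normal-operation model (3-sigma rule).
residuals = flow - model.predict(load.reshape(-1, 1))
threshold = 3 * residuals.std()

# A degraded pump delivers less flow at the same load:
faulty_flow = 2.0 * 80 - 20
is_anomaly = abs(faulty_flow - model.predict([[80]])[0]) > threshold
```

Any regressor can stand in for the linear model; the detection logic stays the same.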

RevDate: 2020-01-24
CmpDate: 2020-01-24

Ming Y, X Yu (2020)

Efficient Privacy-Preserving Data Sharing for Fog-Assisted Vehicular Sensor Networks.

Sensors (Basel, Switzerland), 20(2): pii:s20020514.

Vehicular sensor networks (VSNs) have emerged as a paradigm for improving traffic safety in urban cities. However, there are still several issues with VSNs. Vehicles equipped with sensing devices usually upload large amounts of data reports to a remote cloud center for processing and analyzing, causing heavy computation and communication costs. Additionally, to choose an optimal route, it is required for vehicles to query the remote cloud center to obtain road conditions of the potential moving route, leading to an increased communication delay and leakage of location privacy. To solve these problems, this paper proposes an efficient privacy-preserving data sharing (EP2DS) scheme for fog-assisted vehicular sensor networks. Specifically, the proposed scheme utilizes fog computing to provide local data sharing with low latency; furthermore, it exploits a super-increasing sequence to format the sensing data of different road segments into one report, thus saving on the resources of communication and computation. In addition, using the modified oblivious transfer technology, the proposed scheme can query the road conditions of the potential moving route without disclosing the query location. Finally, an analysis of security suggests that the proposed scheme can satisfy all the requirements for security and privacy, with the evaluation results indicating that the proposed scheme leads to low costs in computation and communication.
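
The super-increasing-sequence trick, which packs bounded readings from several road segments into a single report that can be decoded exactly, can be illustrated as follows; the reading bound and the base construction are hypothetical, not the paper's parameters.

```python
# Sketch: pack several bounded road-segment readings into one integer
# report using a super-increasing sequence, then recover them exactly.
MAX_READING = 1000  # each segment's reading is assumed < MAX_READING

def make_sequence(n):
    """a_i = MAX_READING**i; each term exceeds the largest packable sum
    of all earlier terms, so the sequence is super-increasing."""
    seq = [1]
    for _ in range(n - 1):
        seq.append(seq[-1] * MAX_READING)
    return seq

def pack(readings, seq):
    return sum(r * a for r, a in zip(readings, seq))

def unpack(report, seq):
    readings = []
    for a in reversed(seq):
        readings.append(report // a)
        report %= a
    return list(reversed(readings))

seq = make_sequence(3)
report = pack([120, 45, 998], seq)
recovered = unpack(report, seq)
```

In the scheme described, the single packed report is what gets encrypted and transmitted, cutting per-segment communication overhead.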

RevDate: 2020-01-23

Pittard WS, Villaveces CK, S Li (2020)

A Bioinformatics Primer to Data Science, with Examples for Metabolomics.

Methods in molecular biology (Clifton, N.J.), 2104:245-263.

With the increasing importance of big data in biomedicine, skills in data science are a foundation for the individual career development and for the progress of science. This chapter is a practical guide to working with high-throughput biomedical data. It covers how to understand and set up the computing environment, to start a research project with proper and effective data management, and to perform common bioinformatics tasks such as data wrangling, quality control, statistical analysis, and visualization, with examples on metabolomics data. Concepts and tools related to coding and scripting are discussed. Version control, knitr and Jupyter notebooks are important to project management, collaboration, and research reproducibility. Overall, this chapter describes a core set of skills to work in bioinformatics, and can serve as a reference text at the level of a graduate course and interfacing with data science.

RevDate: 2020-01-17

Nguyen DC, Nguyen KD, PN Pathirana (2019)

A Mobile Cloud based IoMT Framework for Automated Health Assessment and Management.

Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual Conference, 2019:6517-6520.

In recent years, there has been growing interest in the use of mobile cloud and Internet of Medical Things (IoMT) in automated diagnosis and health monitoring. These applications play a significant role in providing smart medical services in modern healthcare systems. In this paper, we deploy a mobile cloud-based IoMT scheme to monitor the progression of a neurological disorder using a test of motor coordination. The computing and storage capabilities of the cloud server are employed to facilitate the estimation of the severity levels given by an established quantitative assessment. An Android application is used for data acquisition and communication with the cloud. Further, we integrate the proposed system with a data-sharing framework in a blockchain network as an innovative solution that allows reliable data exchange among healthcare users. The experimental results show the feasibility of implementing the proposed system in a wide range of healthcare applications.

RevDate: 2020-01-17

Ellis CA, Gu P, Sendi MSE, et al (2019)

A Cloud-based Framework for Implementing Portable Machine Learning Pipelines for Neural Data Analysis.

Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual Conference, 2019:4466-4469.

Cloud-based computing has created new avenues for innovative research. In recent years, numerous cloud-based data analysis projects within the biomedical domain have been implemented. As this field is likely to grow, there is a need for a unified platform for the development and testing of advanced analytic and modeling tools, one that enables those tools to be easily reused for biomedical data analysis by a broad set of users with diverse technical skills. A cloud-based platform of this nature could greatly assist future research endeavors. In this paper, we take the first step towards building such a platform. We define an approach by which containerized analytic pipelines can be distributed for use on cloud-based or on-premise computing platforms. We demonstrate our approach by implementing a portable biomarker identification pipeline using a logistic regression model with elastic net regularization (LR-ENR) and running it on Google Cloud. We used this pipeline for the diagnosis of Parkinson's disease based on a combination of clinical, demographic, and MRI-based features and for the identification of the most predictive biomarkers.
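
The pipeline's model, logistic regression with elastic-net regularization, can be sketched with scikit-learn; the synthetic features below merely stand in for the clinical, demographic, and MRI-based features used in the paper.

```python
# Sketch: elastic-net-regularized logistic regression (LR-ENR) with a
# crude biomarker ranking from coefficient magnitudes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # two informative features

model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, max_iter=5000)
model.fit(X, y)
# The L1 component drives uninformative coefficients toward zero, so
# coefficient magnitudes double as a simple feature (biomarker) ranking.
ranking = np.argsort(-np.abs(model.coef_[0]))
```

The `l1_ratio` knob trades L1 sparsity against L2 shrinkage; 0.5 here is an arbitrary midpoint, not a value from the paper.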

RevDate: 2020-01-27

Vitkin E, Gillis A, Polikovsky M, et al (2020)

Distributed flux balance analysis simulations of serial biomass fermentation by two organisms.

PloS one, 15(1):e0227363.

Intelligent biorefinery design that addresses both the composition of the biomass feedstock and the fermentation microorganisms could benefit from dedicated tools for computational simulation and computer-assisted optimization. Here we present the BioLego Vn2.0 framework, based on Microsoft Azure Cloud, which supports large-scale simulations of biomass serial fermentation processes by two different organisms. BioLego enables the simultaneous analysis of multiple fermentation scenarios and the comparison of the fermentation potential of multiple feedstock compositions. Thanks to the effective use of cloud computing, it further allows resource-intensive analysis and exploration of media and organism modifications. We use BioLego to obtain biological and validation results, including (1) an exploratory search for the optimal utilization of corn biomasses (corn cobs, corn fiber, and corn stover) in fermentation biorefineries; (2) analysis of the possible effects of changes in the composition of K. alvarezi biomass on the ethanol production yield in an anaerobic two-step process (S. cerevisiae followed by E. coli); (3) analysis of the impact, on the estimated ethanol production yield, of knocking out single organism reactions either in one or in both organisms in an anaerobic two-step fermentation process of Ulva sp. into ethanol (S. cerevisiae followed by E. coli); and (4) comparison of several experimentally measured ethanol fermentation rates with the predictions of BioLego.

RevDate: 2020-02-05

Reyna MA, Josef CS, Jeter R, et al (2020)

Early Prediction of Sepsis From Clinical Data: The PhysioNet/Computing in Cardiology Challenge 2019.

Critical care medicine, 48(2):210-217.

OBJECTIVES: Sepsis is a major public health concern with significant morbidity, mortality, and healthcare expenses. Early detection and antibiotic treatment of sepsis improve outcomes. However, although professional critical care societies have proposed new clinical criteria that aid sepsis recognition, the fundamental need for early detection and treatment remains unmet. In response, researchers have proposed algorithms for early sepsis detection, but directly comparing such methods has not been possible because of different patient cohorts, clinical variables and sepsis criteria, prediction tasks, evaluation metrics, and other differences. To address these issues, the PhysioNet/Computing in Cardiology Challenge 2019 facilitated the development of automated, open-source algorithms for the early detection of sepsis from clinical data.

DESIGN: Participants submitted containerized algorithms to a cloud-based testing environment, where we graded entries for their binary classification performance using a novel clinical utility-based evaluation metric. We designed this scoring function specifically for the Challenge to reward algorithms for early predictions and penalize them for late or missed predictions and for false alarms.

SETTING: ICUs in three separate hospital systems. We shared data from two systems publicly and sequestered data from all three systems for scoring.

PATIENTS: We sourced over 60,000 ICU patients with up to 40 clinical variables for each hour of a patient's ICU stay. We applied Sepsis-3 clinical criteria for sepsis onset.

INTERVENTIONS: None.

MEASUREMENTS AND MAIN RESULTS: A total of 104 groups from academia and industry participated, contributing 853 submissions. Furthermore, 90 abstracts based on Challenge entries were accepted for presentation at Computing in Cardiology.

CONCLUSIONS: Diverse computational approaches predict the onset of sepsis several hours before clinical recognition, but generalizability to different hospital systems remains a challenge.

RevDate: 2020-01-31

Jones M, DeRuyter F, J Morris (2020)

The Digital Health Revolution and People with Disabilities: Perspective from the United States.

International journal of environmental research and public health, 17(2): pii:ijerph17020381.

This article serves as the introduction to this special issue on Mobile Health and Mobile Rehabilitation for People with Disabilities. Social, technological and policy trends are reviewed. Needs, opportunities and challenges for the emerging fields of mobile health (mHealth, aka eHealth) and mobile rehabilitation (mRehab) are discussed. Healthcare in the United States (U.S.) is at a critical juncture characterized by: (1) a growing need for healthcare and rehabilitation services; (2) maturing technological capabilities to support more effective and efficient health services; (3) evolving public policies designed, by turns, to contain cost and support new models of care; and (4) a growing need to ensure acceptance and usability of new health technologies by people with disabilities and chronic conditions, clinicians and health delivery systems. The discussion of demographic and population health data, healthcare service delivery, and public policy focuses primarily on the U.S. However, the trends identified (aging populations, growing prevalence of chronic conditions and disability, labor shortages in healthcare) apply to most countries with advanced economies, and to others as well. Furthermore, technologies that enable mRehab (wearable sensors, in-home environmental monitors, cloud computing, artificial intelligence) transcend national boundaries. Remote and mobile healthcare delivery is needed and inevitable. Proactive engagement is critical to ensure acceptance and effectiveness for all stakeholders.

RevDate: 2020-01-17

Kuzniar A, Maassen J, Verhoeven S, et al (2020)

sv-callers: a highly portable parallel workflow for structural variant detection in whole-genome sequence data.

PeerJ, 8:e8214.

Structural variants (SVs) are an important class of genetic variation implicated in a wide array of genetic diseases including cancer. Despite the advances in whole genome sequencing, comprehensive and accurate detection of SVs in short-read data still poses some practical and computational challenges. We present sv-callers, a highly portable workflow that enables parallel execution of multiple SV detection tools and provides users with example analyses of detected SV callsets in a Jupyter Notebook. This workflow supports easy deployment of software dependencies, configuration, and the addition of new analysis tools. Moreover, porting it to different computing systems requires minimal effort. Finally, we demonstrate the utility of the workflow by performing both somatic and germline SV analyses on different high-performance computing systems.

RevDate: 2020-01-17

Masood A, Yang P, Sheng B, et al (2020)

Cloud-Based Automated Clinical Decision Support System for Detection and Diagnosis of Lung Cancer in Chest CT.

IEEE journal of translational engineering in health and medicine, 8:4300113.

Lung cancer is a major cause of cancer-related deaths. Detecting pulmonary cancer in its early stages can greatly increase the survival rate. Manual delineation of lung nodules by radiologists is a tedious task. We developed a novel computer-aided decision support system for lung nodule detection based on a 3D Deep Convolutional Neural Network (3DDCNN) for assisting radiologists. Our decision support system provides a second opinion to radiologists in lung cancer diagnostic decision making. In order to leverage 3-dimensional information from Computed Tomography (CT) scans, we applied median intensity projection and a multi-Region Proposal Network (mRPN) for automatic selection of potential regions of interest. Our Computer Aided Diagnosis (CAD) system has been trained and validated using the LUNA16, ANODE09, and LIDC-IDR datasets; the experiments demonstrate the superior performance of our system, attaining a sensitivity, specificity, AUROC, and accuracy of 98.4%, 92%, 96%, and 98.51%, respectively, with 2.1 FPs per scan. We integrated cloud computing, and trained and validated our cloud-based 3DDCNN on the datasets provided by Shanghai Sixth People's Hospital, as well as LUNA16, ANODE09, and LIDC-IDR. Our system outperformed the state-of-the-art systems and obtained an impressive 98.7% sensitivity at 1.97 FPs per scan. This shows the potential of deep learning, in combination with cloud computing, for accurate and efficient lung nodule detection via CT imaging, which could help doctors and radiologists in treating lung cancer patients.

RevDate: 2020-01-22

de Sousa C, Fatoyinbo L, Neigh C, et al (2020)

Cloud-computing and machine learning in support of country-level land cover and ecosystem extent mapping in Liberia and Gabon.

PloS one, 15(1):e0227438.

Liberia and Gabon joined the Gaborone Declaration for Sustainability in Africa (GDSA), established in 2012, with the goal of incorporating the value of nature into national decision making by estimating the multiple services obtained from ecosystems using the natural capital accounting framework. In this study, we produced 30-m resolution, 10-class land cover maps for the 2015 epoch for Liberia and Gabon using the Google Earth Engine (GEE) cloud platform to support the ongoing natural capital accounting efforts in these nations. We propose an integrated method of pixel-based classification using Landsat 8 data, the Random Forest (RF) classifier, and ancillary data to produce high quality land cover products that fit a broad range of applications, including natural capital accounting. Our approach focuses on a pre-classification filtering (Masking Phase) based on spectral signatures and ancillary data to reduce the number of pixels prone to misclassification, thereby increasing the quality of the final product. The proposed approach yields an overall accuracy of 83% and 81% for Liberia and Gabon, respectively, outperforming prior land cover products for these countries in both thematic content and accuracy. Our approach, while relatively simple and highly replicable, was able to produce high quality land cover products to fill an observational gap in up-to-date land cover data at the national scale for Liberia and Gabon.
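
The pixel-based Random Forest step can be sketched outside Earth Engine with scikit-learn; the four-band "pixels", class centers, and labels below are synthetic stand-ins for Landsat 8 reflectance and training polygons.

```python
# Sketch: Random Forest classification of multispectral pixels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
# Toy 4-band reflectance centers for three classes.
centers = np.array([[0.05, 0.04, 0.03, 0.02],   # water: dark in all bands
                    [0.04, 0.08, 0.05, 0.40],   # forest: bright NIR
                    [0.20, 0.22, 0.25, 0.28]])  # urban: flat spectrum
labels = rng.integers(0, 3, 600)
pixels = centers[labels] + rng.normal(0, 0.02, (600, 4))

rf = RandomForestClassifier(n_estimators=50, random_state=0)
rf.fit(pixels[:500], labels[:500])
acc = rf.score(pixels[500:], labels[500:])
```

In GEE the same pattern runs server-side over whole image collections; only the classifier construction and sampling calls differ.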

RevDate: 2020-01-09

Anderson K, Fawcett D, Cugulliere A, et al (2020)

Vegetation expansion in the subnival Hindu Kush Himalaya.

Global change biology [Epub ahead of print].

The mountain systems of the Hindu Kush Himalaya (HKH) are changing rapidly due to climatic change, but an overlooked component is the subnival ecosystem (between the treeline and snow line), characterized by short-stature plants and seasonal snow. Basic information about subnival vegetation distribution and rates of ecosystem change is not known, yet such information is needed to understand relationships between subnival ecology and water/carbon cycles. We show that HKH subnival ecosystems cover five to 15 times the area of permanent glaciers and snow, highlighting their eco-hydrological importance. Using satellite data from the Landsat 5, 7 and 8 missions, we measured change in the spatial extent of subnival vegetation from 1993 to 2018. The Landsat surface reflectance-derived Normalized Difference Vegetation Index product was thresholded at 0.1 to indicate the presence/absence of vegetation. Using this product, the strength and direction of time-series trends in the green pixel fraction were measured within three regions of interest. We controlled for cloud cover and snow cover, and evaluated the impact of sensor radiometric differences between Landsat 7 and Landsat 8. Using Google Earth Engine to expedite data processing tasks, we show that there has been a weakly positive increase in the extent of subnival vegetation since 1993. The strongest and most significant trends were found in the height region of 5,000-5,500 m a.s.l. across the HKH extent: R2 = .302, Kendall's τ = 0.424, p < .05, but this varied regionally, with height, and according to the sensors included in the time series. Positive trends at lower elevations occurred on steeper slopes, whilst at higher elevations flatter areas exhibited stronger trends. We validated our findings using online photographs. Subnival ecological changes have likely impacted HKH carbon and water cycles with impacts on millions of people living downstream, but the strength and direction of impacts of vegetation expansion remain unknown.
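
The thresholding step, counting a pixel as vegetated when NDVI exceeds 0.1, reduces to a few lines; the red/NIR reflectance values below are made up for illustration.

```python
# Sketch: NDVI from red/NIR reflectance, then the "green pixel
# fraction" as the share of pixels above the 0.1 threshold.
import numpy as np

def green_fraction(red, nir, threshold=0.1):
    ndvi = (nir - red) / (nir + red)
    return float(np.mean(ndvi > threshold))

red = np.array([0.10, 0.10, 0.30, 0.05])
nir = np.array([0.40, 0.15, 0.31, 0.04])
gf = green_fraction(red, nir)  # 2 of 4 pixels exceed NDVI = 0.1
```

Tracking `green_fraction` per region per year gives exactly the time series whose trend the study measures.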

RevDate: 2020-01-31

Cheng X, Chen F, Xie D, et al (2020)

Design of a Secure Medical Data Sharing Scheme Based on Blockchain.

Journal of medical systems, 44(2):52.

With the rapid development of technologies such as artificial intelligence, blockchain, cloud computing, and big data, Medical Cyber Physical Systems (MCPS) increasingly demand data security, while cloud storage solves the storage problem of complex medical data. However, secure data sharing is difficult to realize. The decentralization feature of blockchain helps remove the secure authentication process's heavy dependence on a trusted third party and enables secure data transmission. In this paper, blockchain technology is used to describe the security requirements of the authentication process, and a network model of MCPS based on blockchain is proposed. Through analysis of the medical data storage architecture, it can ensure that data are tamper-proof and traceable. In the security authentication phase, bilinear mapping and intractable problems can be used to address the security threats in the authentication of medical data providers and users. This avoids the credibility problem of a trusted third party, and also realizes two-way authentication between the hospital and the blockchain node. Then, BAN logic is used to analyze the security protocols, and a formal analysis and comparison of security protocols are made. The results show that the MCPS based on blockchain not only realizes medical data sharing, but also meets the various security requirements of the security authentication phase. In addition, the storage and computation overhead is ideal. Therefore, the proposed scheme is well suited for secure sharing of medical big data.

RevDate: 2020-02-07

Rady A, Fischer J, Reeves S, et al (2019)

The Effect of Light Intensity, Sensor Height, and Spectral Pre-Processing Methods when using NIR Spectroscopy to Identify Different Allergen-Containing Powdered Foods.

Sensors (Basel, Switzerland), 20(1):.

Food allergens present a significant health risk to the human population, so their presence must be monitored and controlled within food production environments. This is especially important for powdered food, which can contain nearly all known food allergens. Manufacturing is experiencing the fourth industrial revolution (Industry 4.0), which is the use of digital technologies, such as sensors, Internet of Things (IoT), artificial intelligence, and cloud computing, to improve the productivity, efficiency, and safety of manufacturing processes. This work studied the potential of small low-cost sensors and machine learning to identify different powdered foods which naturally contain allergens. The research utilised a near-infrared (NIR) sensor, and measurements were performed on over 50 different powdered food materials. This work focussed on several measurement and data processing parameters which must be determined when using these sensors, including sensor light intensity, the height between the sensor and the food sample, and the most suitable spectral pre-processing method. It was found that the K-nearest neighbour and linear discriminant analysis machine learning methods had the highest classification prediction accuracy of all methods studied for identifying samples containing allergens. The height between the sensor and the sample had a greater effect than the sensor light intensity, and the classification models performed much better when the sensor was positioned closer to the sample with the highest light intensity. The spectral pre-processing methods with the largest positive impact on classification prediction accuracy were the standard normal variate (SNV) and multiplicative scattering correction (MSC) methods. It was found that with the optimal combination of sensor height, light intensity, and spectral pre-processing, a classification prediction accuracy of 100% could be achieved, making the technique suitable for use within production environments.
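
The two winning pre-processing transforms, SNV and MSC, are simple enough to sketch directly; the random "spectra" below are stand-ins for real NIR scans, and the MSC reference is taken as the mean spectrum, a common but not universal choice.

```python
# Sketch: SNV and MSC spectral pre-processing for NIR spectra.
import numpy as np

def snv(spectra):
    """Standard normal variate: standardize each spectrum to zero
    mean and unit variance."""
    s = np.asarray(spectra, dtype=float)
    return (s - s.mean(axis=1, keepdims=True)) / s.std(axis=1, keepdims=True)

def msc(spectra):
    """Multiplicative scatter correction: regress each spectrum on the
    mean spectrum and remove the fitted offset and slope."""
    s = np.asarray(spectra, dtype=float)
    ref = s.mean(axis=0)
    out = np.empty_like(s)
    for i, row in enumerate(s):
        slope, intercept = np.polyfit(ref, row, 1)  # row ~ slope*ref + intercept
        out[i] = (row - intercept) / slope
    return out

spectra = np.random.default_rng(3).uniform(0.2, 0.8, (5, 100))
corrected = snv(spectra)
```

Both transforms remove scatter-driven baseline and gain differences so the classifier sees chemical, not geometric, variation.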

RevDate: 2020-02-07
CmpDate: 2020-01-07

Ren Y, Zhu F, Sharma PK, et al (2019)

Data Query Mechanism Based on Hash Computing Power of Blockchain in Internet of Things.

Sensors (Basel, Switzerland), 20(1):.

In the IoT (Internet of Things) environment, smart homes, smart grids, and telematics constantly generate data with complex attributes. These data have low heterogeneity and poor interoperability, which brings difficulties to data management and value mining. The promising combination of blockchain and the Internet of Things, known as BCoT (blockchain of things), can solve these problems. This paper introduces an innovative method, DCOMB (dual combination Bloom filter), which first converts the computational power of bitcoin mining into computational power for queries. The DCOMB method is then used to build a blockchain-based IoT data query model. DCOMB can implement queries through mining hash calculation alone. This model combines the data stream of the IoT with the timestamp of the blockchain, improving the interoperability of data and the versatility of the IoT database system. The experimental results show that the random read performance of DCOMB queries is higher than that of COMB (combination Bloom filter), and the error rate of DCOMB is lower. Meanwhile, both DCOMB and COMB query performance are better than that of MySQL (My Structured Query Language).
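
As background for the query structure DCOMB builds on, a minimal Bloom filter can be sketched as follows. The dual-combination construction and its reuse of mining hashes are not reproduced here, and the size and hash-count parameters are arbitrary.

```python
# Sketch: a plain Bloom filter (set membership with false positives
# but no false negatives).
import hashlib

class BloomFilter:
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes = size, hashes
        self.bits = bytearray(size)

    def _positions(self, item):
        # Derive `hashes` bit positions from salted SHA-256 digests.
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:4], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        # All positions set => "probably present"; any unset => absent.
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
bf.add("sensor-42:2020-01-01T00:00")
```

DCOMB's contribution, per the abstract, is to source these hash computations from mining work rather than dedicated query hashing.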

RevDate: 2020-01-08

Smith T, Porter S, D Kovarik (2019)

Immuno-Biotechnology and Bioinformatics in Community Colleges.

Journal of biomolecular techniques : JBT, 30(Suppl):S20-S21.

Immuno-biotechnology is one of the fastest growing areas in the field of biotechnology. Digital World Biology's Biotech-Careers.org database of biotechnology employers (>6800) has nearly 700 organizations that are involved with immunology in some way. With the advent of next generation DNA sequencing, and other technologies, immuno-biotechnology has significantly increased the use of computing technologies to decipher the meaning of large datasets and predict interactions between immune receptors (antibodies / T-Cell receptors / MHC) and their targets. The use of new technologies like immune-profiling - where large numbers of immune receptors are sequenced en masse - and targeted cancer therapies - where researchers create, engineer, and grow modified T cells to attack tumors - are leading to job growth and demands for new skills and knowledge in biomanufacturing, quality systems, immuno-bioinformatics, and cancer biology. In response to these new demands, Shoreline Community College (Shoreline, WA) has begun developing an immuno-biotechnology certificate. Part of this certificate includes a five-week course (30 hours hands-on computer lab) on immuno-bioinformatics. The immuno-bioinformatics course includes exercises in immune profiling, vaccine development, and operating bioinformatics programs using a command line interface. In immune profiling, students explore T-cell receptor datasets from early stage breast cancer samples using Adaptive Biotechnologies (Seattle, WA) immunoSEQ Analyzer public server to learn how T-cells differ between normal tissue, blood, and tumors. Next, they use the IEDB (Immune Epitope Database) in conjunction with Molecule World (Digital World Biology) to predict antigens from sequences and verify the results to learn the differences between continuous and discontinuous epitopes that are recognized by T-cell receptors and antibodies. Finally, to get hands-on experience with bioinformatics programs, students will use cloud computing (CyVerse) and IgBLAST (NCBI) to explore data from an immune profiling experiment.

RevDate: 2020-01-08

Yang A, Kishore A, Phipps B, et al (2019)

Cloud accelerated alignment and assembly of full-length single-cell RNA-seq data using Falco.

BMC genomics, 20(Suppl 10):927.

BACKGROUND: Read alignment and transcript assembly are the core of RNA-seq analysis for transcript isoform discovery. Nonetheless, current tools are not designed to be scalable for analysis of full-length bulk or single cell RNA-seq (scRNA-seq) data. The previous version of our cloud-based tool Falco only focuses on RNA-seq read counting, but does not allow for more flexible steps such as alignment and read assembly.

RESULTS: The Falco framework can harness the parallel and distributed computing environment in modern cloud platforms to accelerate read alignment and transcript assembly of full-length bulk RNA-seq and scRNA-seq data. There are two new modes in Falco: alignment-only and transcript assembly. In the alignment-only mode, Falco can speed up the alignment process by 2.5-16.4x on two public scRNA-seq datasets when compared to alignment on a highly optimised standalone computer. Furthermore, it provides a 10x average speed-up compared to alignment using Rail-RNA, a published cloud-enabled read-alignment tool. In the transcript assembly mode, Falco can speed up the transcript assembly process by 1.7-16.5x compared to performing transcript assembly on a highly optimised computer.

CONCLUSION: Falco is a significantly updated open source big data processing framework that enables scalable and accelerated alignment and assembly of full-length scRNA-seq data on the cloud. The source code can be found at https://github.com/VCCRI/Falco.

RevDate: 2020-01-17

Fu HP, Chang TS, Yeh HP, et al (2019)

Analysis of Factors Influencing Hospitals' Implementation of a Green E-Procurement System Using a Cloud Model.

International journal of environmental research and public health, 16(24):.

Currently, the green procurement activities of private hospitals in Taiwan rely on self-built green electronic-procurement (e-procurement) systems. These require professional personnel to spend time regularly updating the green specifications and the software and hardware of the e-procurement system, and the information-system maintenance cost is high. If a green e-procurement system crashes, the efficiency of the hospital's green procurement activities suffers. Moving green e-procurement to a convenient and trustworthy cloud computing model would enhance the efficiency of procurement activities and reduce information maintenance costs for private hospitals. However, implementing a cloud model is a matter of technology innovation application, for which the technology-organization-environment (TOE) framework has been widely applied as the theoretical basis. In addition, finding the weights of factors is a multi-criteria decision-making (MCDM) problem. Therefore, the present study first collected factors influencing implementation of the cloud model, using the TOE as the theoretical framework, by reviewing the literature. An expert questionnaire was then designed and distributed to top managers of 20 private hospitals in southern Taiwan, and the fuzzy analytic hierarchy process (FAHP), an MCDM tool, was used to find the weights of the factors influencing these hospitals' implementation of a cloud green e-procurement system. The results can enable private hospitals to successfully implement a green e-procurement system through a cloud model by optimizing resource allocation according to the weight of each factor. In addition, the results can help cloud service providers of green e-procurement understand users' needs and develop relevant cloud solutions and marketing strategies.

RevDate: 2020-01-08

Ounit R, Mason C, Lonardi S, et al (2019)

A Metagenomic Analysis of Environmental and Clinical Samples Using a Secure Hybrid Cloud Solution.

Journal of biomolecular techniques : JBT, 30(Suppl):S2.

The number and types of studies about the human microbiome, metagenomics and personalized medicine, and clinical genomics are increasing at an unprecedented rate, leading to computational challenges. For example, the analysis of patient/clinical samples requires methods capable of (i) accurately detecting pathogenic organisms, (ii) running with high speed to allow short response times and diagnosis, and (iii) scaling to ever-growing databases of reference genomes. While cloud computing has the potential to offer low-cost solutions to these needs, serious concerns regarding the protection of genomic data exist due to the lack of control and security in remote genomic databases. We present a novel metagenomic analysis system called "Virgile" that is capable of performing privacy-preserving queries on databases hosted on outsourced servers (e.g., public or cloud-based). This method takes as input the sequenced data produced by any modern sequencing instrument (e.g., Illumina, Pacbio, Oxford Nanopore) and outputs the microbial profile using a database of whole genome sequences (e.g., the RefSeq database from NCBI). The profiling algorithm aims to estimate without bias the abundance of the microorganisms present using a genome-centric approach. Results: Using an extensive set of 65 simulated datasets, negative and positive controls, real clinical samples, and mock communities, we show that Virgile identifies and estimates the abundance of organisms present in environmental or clinical samples with high accuracy compared to state-of-the-art and popular methods, including MetaPhlAn2 and KrakenUniq. Running at high speed, Virgile can also be run on a standard 8 GB RAM laptop. Virgile is thus a privacy-preserving abundance estimation algorithm that can efficiently and rapidly discern the abundance and taxonomic identity of organisms present in a metagenomic sample, including those from medical environments.
To the best of our knowledge, Virgile is the only metagenome analysis system leveraging cloud computing in a secure manner.

RevDate: 2020-02-07
CmpDate: 2019-12-30

Luo Y, Li W, S Qiu (2019)

Anomaly Detection Based Latency-Aware Energy Consumption Optimization For IoT Data-Flow Services.

Sensors (Basel, Switzerland), 20(1):.

The continuous data-flow application in the IoT integrates the functions of fog, edge, and cloud computing; its typical paradigm is the E-Health system. Like other IoT applications, optimizing the energy consumption of IoT devices in continuous data-flow applications is a challenging problem. Since anomalous nodes in the network increase energy consumption, it is necessary to make continuous data flows bypass these nodes as much as possible. To date, existing research has optimized the performance of continuous data flows mainly through system architecture design and deployment. In this paper, a mathematical programming method is proposed for the first time to optimize the runtime performance of continuous data-flow applications. A lightweight anomaly detection method is proposed to evaluate the reliability of nodes, and the node reliability is then input into the optimization algorithm to estimate task latency. Latency-aware energy consumption optimization for continuous data flows is modeled as a mixed-integer nonlinear programming problem, and a block coordinate descent-based max-flow algorithm is proposed to solve it. Numerical simulations based on real-life datasets show that the proposed strategy outperforms the benchmark strategy.

RevDate: 2020-02-07
CmpDate: 2019-12-30

Zyrianoff I, Heideker A, Silva D, et al (2019)

Architecting and Deploying IoT Smart Applications: A Performance-Oriented Approach.

Sensors (Basel, Switzerland), 20(1):.

Layered internet of things (IoT) architectures have been proposed over the last years as they facilitate understanding the roles of the different networking, hardware, and software components of smart applications. These applications are inherently distributed, spanning from devices installed in the field up to a cloud datacenter and on to a user's smartphone, passing through intermediary stages at different levels of the fog computing infrastructure. However, IoT architectures provide almost no hints on where components should be deployed. IoT software platforms derived from the layered architectures are expected to adapt to scenarios with different characteristics, requirements, and constraints from stakeholders and applications. In such a complex environment, a one-size-fits-all approach does not adapt well to varying demands and may hinder the adoption of IoT smart applications. In this paper, we propose a 5-layer IoT architecture and a 5-stage IoT computing continuum, and provide insights on the mapping of software components of the former onto physical locations of the latter. We also conduct a performance analysis study with six configurations in which components are deployed into different stages. Our results show that different deployment configurations of layered components into staged locations generate bottlenecks that affect system performance and scalability. Based on this, policies for static deployment and dynamic migration of layered components into staged locations can be identified.

RevDate: 2019-12-24

Feng L, Zhou L, Gupta A, et al (2019)

Solving Generalized Vehicle Routing Problem With Occasional Drivers via Evolutionary Multitasking.

IEEE transactions on cybernetics [Epub ahead of print].

With the emergence of crowdshipping and the sharing economy, the vehicle routing problem with occasional drivers (VRPOD) has recently been proposed to involve occasional drivers with private vehicles in the delivery of goods. In this article, we present a generalized variant of VRPOD, namely the vehicle routing problem with heterogeneous capacity, time window, and occasional driver (VRPHTO), which takes the capacity heterogeneity and time windows of vehicles into consideration. Furthermore, to meet the requirement of today's cloud computing services, wherein multiple optimization tasks may need to be solved at the same time, we propose a novel evolutionary multitasking algorithm (EMA) to optimize multiple VRPHTOs simultaneously with a single population. Finally, 56 new VRPHTO instances are generated based on existing common vehicle routing benchmarks. Comprehensive empirical studies are conducted to illustrate the benefits of the new VRPHTOs and to verify the efficacy of the proposed EMA for multitasking against a state-of-the-art single-task evolutionary solver. The results show that the employment of occasional drivers can significantly reduce the routing cost, and that the proposed EMA is not only able to solve multiple VRPHTOs simultaneously but can also achieve enhanced optimization performance via knowledge transfer between tasks along the evolutionary search process.

RevDate: 2020-01-08

Peeler EJ, I Ernst (2019)

A new approach to the management of emerging diseases of aquatic animals.

Revue scientifique et technique (International Office of Epizootics), 38(2):537-551.

Since 1970, aquaculture has grown at a rate of between 5% and 10% per annum. It has achieved this by expanding into new areas, farming new (often non-native) species and intensifying production. These features of aquaculture, combined with large-scale movements of animals, have driven disease emergence, with negative consequences for both production and biodiversity. Efforts to improve the management of emerging diseases of aquatic animals must include actions to reduce the rate of disease emergence, enhance disease detection and reporting, and improve responses to prevent disease spread. The rate of disease emergence can be reduced by understanding the underpinning mechanisms and developing measures to mitigate them. The three principal mechanisms of disease emergence, namely, host switching, decreased host immunocompetence and increased pathogen virulence, have many drivers. The most important of these drivers are those that expose susceptible hosts to novel pathogens (e.g. the introduction of non-native hosts, translocation of pathogens, and increased interaction between wild and farmed populations), followed by host switching. Exposure to wild populations can be reduced through infrastructure and management measures to reduce escapes or exclude wild animals (e.g. barrier nets, filtration and closed-confinement technology). A high standard of health management ensures immunocompetence and resistance to putative new pathogens and strains, and thus reduces the rate of emergence. Appropriate site selection and husbandry can reduce the likelihood of pathogens developing increased virulence by preventing their continuous cycling in geographically or temporally linked populations. The under-reporting of emerging aquatic animal diseases constrains appropriate investigation and timely response. At the producer level, employing information and communications technology (e.g. 
smartphone applications and Cloud computing) to collect and manage data, coupled with a farmer-centric approach to surveillance, could improve reporting. In addition, reporting behaviours must be understood and disincentives mitigated. At the international level, improving the reporting of emerging diseases to the World Organisation for Animal Health allows Member Countries to implement appropriate measures to reduce transboundary spread. Reporting would be incentivised if the global response included the provision of support to low-income countries to, in the short term, control a reported emerging disease, and, in the longer term, develop aquatic animal health services. Early detection and reporting of emerging diseases are only of benefit if Competent Authorities' responses prevent disease spread. Effective responses to emerging diseases are challenging because basic information and tools are often lacking. Consequently, responses are likely to be sub-optimal unless contingency plans have been developed and tested, and decision-making arrangements have been well established.

RevDate: 2020-01-14
CmpDate: 2020-01-14

Yan H, Lu H, X Zhang (2019)

[Research on Risk Control of Hysteroscopy and Laparoscopy Based on Cloud Computing].

Zhongguo yi liao qi xie za zhi = Chinese journal of medical instrumentation, 43(6):459-461.

This paper discusses the use of the Medatc System for inspection and failure statistics of hysteroscopy and laparoscopy equipment. We compiled one year of hysteroscopy and laparoscopy maintenance records at our hospital, comprising about 200 repair failures and more than 20 patrol inspections. The equipment is inspected with professional quality-control tools. The purpose is to summarize experience, improve maintenance efficiency, reduce the risk of using the instruments, and serve clinical departments well.

RevDate: 2019-12-18

Reyna MA, Josef CS, Jeter R, et al (2019)

Early Prediction of Sepsis From Clinical Data: The PhysioNet/Computing in Cardiology Challenge 2019.

Critical care medicine [Epub ahead of print].

OBJECTIVES: Sepsis is a major public health concern with significant morbidity, mortality, and healthcare expenses. Early detection and antibiotic treatment of sepsis improve outcomes. However, although professional critical care societies have proposed new clinical criteria that aid sepsis recognition, the fundamental need for early detection and treatment remains unmet. In response, researchers have proposed algorithms for early sepsis detection, but directly comparing such methods has not been possible because of different patient cohorts, clinical variables and sepsis criteria, prediction tasks, evaluation metrics, and other differences. To address these issues, the PhysioNet/Computing in Cardiology Challenge 2019 facilitated the development of automated, open-source algorithms for the early detection of sepsis from clinical data.

DESIGN: Participants submitted containerized algorithms to a cloud-based testing environment, where we graded entries for their binary classification performance using a novel clinical utility-based evaluation metric. We designed this scoring function specifically for the Challenge to reward algorithms for early predictions and penalize them for late or missed predictions and for false alarms.

SETTING: ICUs in three separate hospital systems. We shared data from two systems publicly and sequestered data from all three systems for scoring.

PATIENTS: We sourced over 60,000 ICU patients with up to 40 clinical variables for each hour of a patient's ICU stay. We applied Sepsis-3 clinical criteria for sepsis onset.

INTERVENTIONS: None.

MEASUREMENTS AND MAIN RESULTS: A total of 104 groups from academia and industry participated, contributing 853 submissions. Furthermore, 90 abstracts based on Challenge entries were accepted for presentation at Computing in Cardiology.

CONCLUSIONS: Diverse computational approaches predict the onset of sepsis several hours before clinical recognition, but generalizability to different hospital systems remains a challenge.

RevDate: 2020-01-26
CmpDate: 2019-12-18

Wazid M, Das AK, Shetty S, et al (2019)

LDAKM-EIoT: Lightweight Device Authentication and Key Management Mechanism for Edge-Based IoT Deployment.

Sensors (Basel, Switzerland), 19(24):.

In recent years, edge computing has emerged as a new concept in the computing paradigm that empowers several future technologies, such as 5G, vehicle-to-vehicle communications, and the Internet of Things (IoT), by providing cloud computing facilities, as well as services to the end users. However, open communication among the entities in an edge based IoT environment makes it vulnerable to various potential attacks that are executed by an adversary. Device authentication is one of the prominent techniques in security that permits an IoT device to authenticate mutually with a cloud server with the help of an edge node. If authentication is successful, they establish a session key between them for secure communication. To achieve this goal, a novel device authentication and key management mechanism for the edge based IoT environment, called the lightweight authentication and key management scheme for the edge based IoT environment (LDAKM-EIoT), was designed. The detailed security analysis and formal security verification conducted by the widely used "Automated Validation of Internet Security Protocols and Applications (AVISPA)" tool prove that the proposed LDAKM-EIoT is secure against several attack vectors that exist in the infrastructure of the edge based IoT environment. The elaborated comparative analysis of the proposed LDAKM-EIoT and different closely related schemes provides evidence that LDAKM-EIoT is more secure with less communication and computation costs. Finally, the network performance parameters are calculated and analyzed using the NS2 simulation to demonstrate the practical facets of the proposed LDAKM-EIoT.

RevDate: 2020-01-24
CmpDate: 2019-12-18

Capella JV, Bonastre A, Ors R, et al (2019)

A New Application of Internet of Things and Cloud Services in Analytical Chemistry: Determination of Bicarbonate in Water.

Sensors (Basel, Switzerland), 19(24):.

In a constantly evolving world, new technologies such as Internet of Things (IoT) and cloud-based services offer great opportunities in many fields. In this paper we propose a new approach to the development of smart sensors using IoT and cloud computing, which open new interesting possibilities in analytical chemistry. According to IoT philosophy, these new sensors are able to integrate the generated data on the existing IoT platforms, so that information may be used whenever needed. Furthermore, the utilization of these technologies permits one to obtain sensors with significantly enhanced features using the information available in the cloud. To validate our new approach, a bicarbonate IoT-based smart sensor has been developed. A classical CO2 ion selective electrode (ISE) utilizes the pH information retrieved from the cloud and then provides an indirect measurement of bicarbonate concentration, which is offered to the cloud. The experimental data obtained are compared to those yielded by three other classical ISEs, with satisfactory results being achieved in most instances. Additionally, this methodology leads to lower-consumption, low-cost bicarbonate sensors capable of being employed within an IoT application, for instance in the continuous monitoring of HCO3- in rivers. Most importantly, this innovative application field of IoT and cloud approaches can be clearly perceived as an indicator for future developments over the short-term.
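The indirect measurement described above follows from standard carbonate chemistry, although the paper's exact computation is not reproduced here. A minimal sketch, assuming the textbook Henderson-Hasselbalch relation for the CO2/HCO3- pair (pKa1 ≈ 6.35 at 25 °C) and illustrative input values of my own choosing:

```python
PKA1 = 6.35  # first acid dissociation constant of CO2(aq)/H2CO3 at 25 degC

def bicarbonate_from_co2(co2_molar, ph):
    """Infer [HCO3-] (mol/L) from a CO2 ISE reading plus a pH value
    retrieved from the cloud, via Henderson-Hasselbalch:
    pH = pKa1 + log10([HCO3-] / [CO2])."""
    return co2_molar * 10.0 ** (ph - PKA1)

# Illustrative river-water values, not taken from the paper.
hco3 = bicarbonate_from_co2(co2_molar=1.2e-5, ph=7.8)
```

This is exactly the kind of derived quantity the smart sensor can publish back to the IoT platform alongside the raw ISE reading.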

RevDate: 2019-12-17

Haghshenas H, Habibi J, MA Fazli (2019)

Parasite cloud service providers: on-demand prices on top of spot prices.

Heliyon, 5(11):e02877 pii:e02877.

On-demand resource provisioning and elasticity are two of the main characteristics of the cloud computing paradigm. As a result, the load on a cloud service provider (CSP) is not fixed, and almost always some of its physical resources, called spare resources, go unused. As CSPs typically do not want to be under-provisioned at any time, they procure physical resources in accordance with a pessimistic forecast of their loads, which leads to a large amount of spare resources most of the time. Some CSPs rent out their spare resources at a lower price, called the spot price, which varies over time with the market or the internal state of the CSP. In this paper, we assume the spot price to be a function of the CSP's load. We introduce the concept of a parasite CSP, which rents spare resources from several CSPs simultaneously at spot prices and rents them to its customers at an on-demand price lower than the host CSPs' on-demand prices. We propose the overall architecture and interaction model of the parasite CSP. A mathematical analysis calculates the amount of spare resources of the host CSPs, the amount of resources the parasite CSP can rent (its virtual capacity), and the probability of SLA violations. We evaluate our analysis on pricing data gathered from Amazon EC2 services. The results show that if the parasite CSP relies on several host CSPs, its virtual capacity can be considerable and the expected penalty due to SLA violation is acceptably low.
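The parasite CSP's business case rests on simple arithmetic: its resale price must sit below the hosts' on-demand prices yet above the spot prices it pays. A minimal sketch of that relationship, with a hypothetical helper function and illustrative $/instance-hour figures not taken from the paper:

```python
def parasite_margin(spot_prices, on_demand_price, parasite_price):
    """Per instance-hour margin for a parasite CSP that rents spare
    capacity at host spot prices and resells it at its own on-demand price."""
    assert parasite_price < on_demand_price, "must undercut the host CSPs"
    cheapest_spot = min(spot_prices)  # rent from the cheapest host first
    return parasite_price - cheapest_spot

# Hypothetical prices for three host CSPs (illustrative values only).
margin = parasite_margin(spot_prices=[0.031, 0.027, 0.045],
                         on_demand_price=0.096,
                         parasite_price=0.080)
```

The real model in the paper must additionally account for spot-price variation over time and the probability of SLA-violation penalties, which this sketch omits.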

RevDate: 2020-01-08

Ganesan B, Gowda T, Al-Jumaily A, et al (2019)

Ambient assisted living technologies for older adults with cognitive and physical impairments: a review.

European review for medical and pharmacological sciences, 23(23):10470-10481.

The global number of people over the age of 60 years is expected to increase from 970 million to 2.1 billion in 2050 and 3.1 billion in 2100, with about 80% of this aging population in developing countries. The aging population may suffer from various physical, cognitive, and social problems due to the aging process, such as impairment of physical functions (decreased mobility and walking speed, falls, frailty, difficulties in basic and instrumental activities of daily living), cognitive functions (memory-related issues), sensory functions (hearing loss, cataracts and refractive errors, presbyopia, decreased vestibular function), behavioural and psychological disorders, social isolation, and poor quality of life. Over the last few decades, emerging technologies such as the internet of things (IoT), artificial intelligence (AI), sensors, cloud computing, wireless communication technologies, and assistive robotics have given rise to the vision of various ambient or active assisted living (AAL) approaches that support elderly people in living safely and independently in their own environment, participating in daily and community activities, and maintaining their physical and mental health and quality of life. The aim of this paper is to review the use of ambient or active assisted living for older adults with physical and cognitive impairments, and their social participation.

RevDate: 2019-12-19

Wercelens P, da Silva W, Hondo F, et al (2019)

Bioinformatics Workflows With NoSQL Database in Cloud Computing.

Evolutionary bioinformatics online, 15:1176934319889974.

Scientific workflows can be understood as arrangements of managed activities executed by different processing entities. Applying workflows to solve problems in Molecular Biology, notably those related to sequence analyses, is a regular Bioinformatics approach. Due to the nature of the raw data and the in silico environment of Molecular Biology experiments, apart from the research subject itself, 2 practical and closely related problems have been studied: reproducibility and the computational environment. When aiming to enhance the reproducibility of Bioinformatics experiments, various aspects should be considered. The reproducibility requirements comprise the data provenance, which enables the acquisition of knowledge about the trajectory of data over a defined workflow, the settings of the programs, and the entire computational environment. Cloud computing is a booming alternative that can provide this computational environment, hiding technical details and delivering a more affordable, accessible, and configurable on-demand environment for researchers. Considering this specific scenario, we propose a solution to improve the reproducibility of Bioinformatics workflows in a cloud computing environment using both Infrastructure as a Service (IaaS) and Not only SQL (NoSQL) database systems. To meet this goal, we built 3 typical Bioinformatics workflows and ran them on 1 private and 2 public clouds, using different types of NoSQL database systems to persist the provenance data according to the Provenance Data Model (PROV-DM). We present here the results and a guide for the deployment of a cloud environment for Bioinformatics, exploring the characteristics of various NoSQL database systems for persisting provenance data.

RevDate: 2020-01-24
CmpDate: 2019-12-17

Chi PW, MH Wang (2019)

Privacy-Preserving Broker-ABE Scheme for Multiple Cloud-Assisted Cyber Physical Systems.

Sensors (Basel, Switzerland), 19(24): pii:s19245463.

Cloud-assisted cyber-physical systems (CCPSs) integrate the physical space with cloud computing: sensors in the field collect real-life data and forward it to clouds for further data analysis and decision-making. Since multiple services may be accessed at the same time, sensor data should be forwarded to different cloud service providers (CSPs). In this scenario, attribute-based encryption (ABE) is an appropriate technique for securing data communication between sensors and clouds: each cloud has its own attributes, and a broker can determine which cloud is authorized to access data from the requirements set at the time of encryption. In this paper, we propose a privacy-preserving broker-ABE scheme for multiple CCPSs (MCCPS). The scheme separates the policy-embedding job from the ABE task. To ease the computational burden on the sensors, it leaves policy embedding to the broker, which is generally more powerful than the sensors. Moreover, the proposed scheme provides a way for CSPs to protect data privacy from outside coercion.

RevDate: 2020-01-26
CmpDate: 2019-12-17

Fan YC, Liu YC, CA Chu (2019)

Efficient CORDIC Iteration Design of LiDAR Sensors' Point-Cloud Map Reconstruction Technology.

Sensors (Basel, Switzerland), 19(24):.

In this paper, we propose an efficient COordinate Rotation DIgital Computer (CORDIC) iteration circuit design for Light Detection and Ranging (LiDAR) sensors. A novel CORDIC architecture that pre-selects angles and reduces the number of iterations is presented for LiDAR sensors. The values of the trigonometric functions can be found in seven rotations regardless of the number of input digits N, reducing the number of iterations by more than half. The experimental results show a similarity value of 1 in all cases and prove that the LiDAR decoded packet results are exactly the same as the ground truth. The total chip area is 1.93 mm × 1.93 mm and the core area is 1.32 mm × 1.32 mm. The number of logic gates is 129,688. The designed chip takes only 0.012 ms and 0.912 ms to decode a packet and a 3D frame of LiDAR sensor data, respectively. The throughput of the chip is 8.2105 × 10^8 bits/sec. The average power consumption is 237.34 mW at a maximum operating frequency of 100 MHz. This design not only reduces the number of iterations and the computing time but also reduces the chip area. This paper provides an efficient CORDIC iteration design and solution for LiDAR sensors to reconstruct point-cloud maps for autonomous vehicles.
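The paper's contribution is a hardware architecture with pre-selected angles; for orientation, the baseline it improves on is the classical CORDIC iteration, which can be sketched in software as follows (this is the textbook algorithm, not the proposed seven-rotation design):

```python
import math

def cordic(theta, n_iter=24):
    """Approximate (cos(theta), sin(theta)) for theta in [-pi/2, pi/2]
    using the classical CORDIC rotation iterations."""
    angles = [math.atan(2.0 ** -i) for i in range(n_iter)]  # atan(2^-i)
    # Gain correction: each micro-rotation scales the vector by sqrt(1 + 2^-2i).
    k = 1.0
    for i in range(n_iter):
        k /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0  # rotate toward the residual angle
        # In hardware, the multiplications by 2^-i are plain bit shifts.
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * k, y * k

c, s = cordic(math.pi / 6)
```

Because each iteration needs only shifts and additions, the per-iteration cost is tiny; the paper's gains come from cutting the iteration count itself.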

RevDate: 2020-01-14

Shi L, Z Wang (2019)

Computational Strategies for Scalable Genomics Analysis.

Genes, 10(12):.

The revolution in next-generation DNA sequencing technologies is leading to explosive data growth in genomics, posing a significant challenge to the computing infrastructure and software algorithms for genomics analysis. Various big data technologies have been explored to scale up/out current bioinformatics solutions to mine the big genomics data. In this review, we survey some of these exciting developments in the applications of parallel distributed computing and special hardware to genomics. We comment on the pros and cons of each strategy in the context of ease of development, robustness, scalability, and efficiency. Although this review is written for an audience from the genomics and bioinformatics fields, it may also be informative for the audience of computer science with interests in genomics applications.

RevDate: 2020-01-24
CmpDate: 2019-12-11

Li H, Gu Z, Deng L, et al (2019)

A Fine-Grained Video Encryption Service Based on the Cloud-Fog-Local Architecture for Public and Private Videos.

Sensors (Basel, Switzerland), 19(24):.

With the advancement of cloud computing and fog computing, more and more services and data are being moved from local servers to the fog and cloud for processing and storage. Videos are an important part of this movement, but the security issues involved have drawn wide attention. Although many video-encryption algorithms have been developed to protect local videos, they fail to solve the new problems faced on the media cloud, such as how to provide a video encryption service to devices with low computing power, how to meet the different encryption requirements of different types of videos, and how to ensure massive video encryption efficiency. To solve these three problems, we propose a cloud-fog-local video encryption framework which consists of a three-layer service model and corresponding key management strategies, a fine-grained video encryption algorithm based on the network abstraction layer unit (NALU), and a massive video encryption framework based on Spark. Experiments prove that the proposed solution can meet the different encryption requirements of public and private videos. Moreover, in the experimental environment, our encryption algorithm for public videos reaches a speed of 1708 Mbps and can provide a real-time encryption service for at least 42 channels of 4K-resolution video.

RevDate: 2020-01-08
CmpDate: 2019-12-11

Wang T, Lu Y, Cao Z, et al (2019)

When Sensor-Cloud Meets Mobile Edge Computing.

Sensors (Basel, Switzerland), 19(23):.

Sensor-clouds are a combination of wireless sensor networks (WSNs) and cloud computing. The emergence of sensor-clouds has greatly enhanced the computing power and storage capacity of traditional WSNs by exploiting the advantages of cloud computing in resource utilization. However, many problems remain to be solved in sensor-clouds, such as the limitations of WSNs in terms of communication and energy, high latency, and the security and privacy issues of using a cloud platform as the data processing and control center. In recent years, mobile edge computing has received increasing attention from industry and academia. The core of mobile edge computing is to migrate some or all of the computing tasks of the original cloud computing center to the vicinity of the data source, which gives mobile edge computing great potential for addressing the shortcomings of sensor-clouds. In this paper, the latest research on sensor-clouds is briefly analyzed and the characteristics of existing sensor-clouds are summarized. After that, we discuss the issues of sensor-clouds and propose some applications, in particular a trust evaluation mechanism and trustworthy data collection, that use mobile edge computing to solve problems in sensor-clouds. Finally, we discuss research challenges and future research directions in leveraging mobile edge computing for sensor-clouds.

RevDate: 2019-12-26

Oliveira ASF, Edsall CJ, Woods CJ, et al (2019)

A General Mechanism for Signal Propagation in the Nicotinic Acetylcholine Receptor Family.

Journal of the American Chemical Society, 141(51):19953-19958.

Nicotinic acetylcholine receptors (nAChRs) modulate synaptic activity in the central nervous system. The α7 subtype, in particular, has attracted considerable interest in drug discovery as a target for several conditions, including Alzheimer's disease and schizophrenia. Identifying agonist-induced structural changes underlying nAChR activation is fundamentally important for understanding biological function and rational drug design. Here, extensive equilibrium and nonequilibrium molecular dynamics simulations, enabled by cloud-based high-performance computing, reveal the molecular mechanism by which structural changes induced by agonist unbinding are transmitted within the human α7 nAChR. The simulations reveal the sequence of coupled structural changes involved in driving conformational change responsible for biological function. Comparison with simulations of the α4β2 nAChR subtype identifies features of the dynamical architecture common to both receptors, suggesting a general structural mechanism for signal propagation in this important family of receptors.

RevDate: 2019-12-09

Chung H, Jeong C, Luhach AK, et al (2019)

Remote Pulmonary Function Test Monitoring in Cloud Platform via Smartphone Built-in Microphone.

Evolutionary bioinformatics online, 15:1176934319888904.

With an aging population that continues to grow, health care technology plays an increasingly active role, especially for chronic disease management. In the health care market, cloud platform technology is becoming popular, as both patients and physicians demand cost efficiency, easy access to information, and security. For asthma and chronic obstructive pulmonary disease (COPD) patients in particular, it is recommended that a pulmonary function test (PFT) be performed on a daily basis. However, it is difficult for patients to visit a hospital frequently to perform the PFT. In this study, we present an application and cloud platform for remote PFT monitoring in which measurements are taken directly by a smartphone microphone, with no external devices. In addition, we adopted the IBM Watson Internet-of-Things (IoT) platform for PFT monitoring, using a smartphone's built-in microphone with a high-resolution time-frequency representation. We successfully demonstrated real-time PFT monitoring using the cloud platform. The PFT parameter FEV1/FVC (%) could be remotely monitored while a subject performed the PFT. As a pilot study, we tested 13 healthy subjects and found a mean absolute error of 4.12 and a standard deviation of 3.45 across all 13 subjects. With the developed applications on the cloud platform, patients can freely measure the PFT parameters without restrictions of time and place, and a physician can monitor the patients' status in real time. We hope that the PFT monitoring platform will serve as a means for early detection and treatment of patients with pulmonary diseases, especially those with asthma and COPD.

RevDate: 2020-01-08

Olatinwo DD, Abu-Mahfouz A, G Hancke (2019)

A Survey on LPWAN Technologies in WBAN for Remote Health-Care Monitoring.

Sensors (Basel, Switzerland), 19(23):.

In ubiquitous health-care monitoring (HCM), wireless body area networks (WBANs) are envisioned as appealing solutions that may offer reliable methods for real-time monitoring of patients' health conditions by employing emerging communication technologies. This paper therefore focuses on the state-of-the-art wireless communication systems that can be explored for next-generation WBAN solutions for HCM. This study also addresses the critical issues confronting the existing WBANs employed in HCM. Examples of such issues include the wide-range health data communication constraint, health data delivery reliability, and energy efficiency, which are attributed to the limitations of the legacy short-range, medium-range, and cellular technologies typically employed in WBAN systems. Since WBAN sensor devices are usually configured with finite battery power, they often get drained during prolonged operation. This is exacerbated by the fact that legacy communication systems, such as ZigBee, Bluetooth, and 6LoWPAN, consume more energy during data communications. This situation offers scope for employing the suitable communication systems identified in this study to improve the productivity of WBANs in HCM. To this end, emerging communication systems such as low-power wide-area networks (LPWANs) are investigated in this study based on their transmission power, data transmission rate, data reliability in the context of efficient data delivery, communication coverage, and latency, including their advantages as well as disadvantages. As a consequence, LPWAN solutions are presented for WBAN systems in remote HCM. Furthermore, this work also points out future directions for realizing the next generation of WBANs, as well as ways to improve the identified communication systems to further enhance their productivity in WBAN solutions for HCM.

RevDate: 2020-01-08
CmpDate: 2019-12-06

Shi P, Li N, Wang S, et al (2019)

Quantum Multi-User Broadcast Protocol for the "Platform as a Service" Model.

Sensors (Basel, Switzerland), 19(23): pii:s19235257.

Quantum cloud computing is a technology with the potential to shape the future of computing. In the "Platform as a Service (PaaS)" type of cloud computing, the development environment is delivered as a service. In this paper, a multi-user network broadcast protocol is developed in a one-master, N-slave mode using a sequence of single photons. It can be applied to a multi-node network, in which a single-photon sequence can be sent to all the slave nodes simultaneously. In broadcast communication networks, these single photons encode classical information directly over noisy quantum communication channels. The results show that this protocol can realize secret key generation and sharing among multiple nodes. The proposed protocol is also proved to be unconditionally secure in theory, which indicates its feasibility for theoretical application.

RevDate: 2020-01-08
CmpDate: 2019-12-09

Gonzalez LF, Vidal I, Valera F, et al (2019)

Transport-Layer Limitations for NFV Orchestration in Resource-Constrained Aerial Networks.

Sensors (Basel, Switzerland), 19(23):.

In this paper, we identify the main challenges and problems related with the management and orchestration of Virtualized Network Functions (VNFs) over aerial networks built with Small Unmanned Aerial Vehicles (SUAVs). Our analysis starts from a reference scenario, where several SUAVs are deployed over a delimited geographic area, and provide a mobile cloud environment that supports the deployment of functions and services using Network Functions Virtualization (NFV) technologies. After analyzing the main challenges to NFV orchestration in this reference scenario from a theoretical perspective, we undertake the study of one specific but relevant aspect following a practical perspective, i.e., the limitations of existing transport-layer solutions to support the dissemination of NFV management and orchestration information in the considered scenario. While in traditional cloud computing environments this traffic is delivered using TCP, our simulation results suggest that using this protocol over an aerial network of SUAVs presents certain limitations. Finally, based on the lessons learned from our practical analysis, the paper outlines different alternatives that could be followed to address these challenges.

RevDate: 2020-01-08

Manocha A, Singh R, M Bhatia (2019)

Cognitive Intelligence Assisted Fog-Cloud Architecture for Generalized Anxiety Disorder (GAD) Prediction.

Journal of medical systems, 44(1):7.

Generalized Anxiety Disorder (GAD) is a psychological disorder caused by high stress from daily life activities. It causes severe health issues, such as sore muscles, low concentration, fatigue, and sleep deprivation. The limited availability of predictive solutions specifically for individuals suffering from GAD can itself contribute to health and psychological adversity. The proposed solution aims to monitor health, behavioral, and environmental parameters of the individual to predict health adversity caused by GAD. Initially, a Weighted Naïve Bayes (W-NB) classifier is utilized to predict irregular health events by classifying the captured data at the fog layer. The proposed two-phased decision-making process helps to optimize the distribution of required medical services by determining the scale of vulnerability. Furthermore, the utility of the framework is increased by calculating a health vulnerability index using an Adaptive Neuro-Fuzzy Inference System-Genetic Algorithm (ANFIS-GA) on the cloud. The presented work addresses these concerns through efficient monitoring of anomalies followed by a time-sensitive, two-phased alert generation procedure. To assess the performance of irregular event identification and health severity prediction, the framework was deployed in a living room for 30 days, during which 15 individuals aged 68 to 78 years were continuously monitored. The calculated outcomes demonstrate the monitoring efficiency of the proposed framework over manual monitoring policies.
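The abstract gives no implementation details for the W-NB classifier. As an illustrative sketch only, one plausible reading of a "weighted" naive Bayes is a Gaussian naive Bayes whose per-feature log-likelihood contributions are scaled by weights; the class name, weighting scheme, and data below are assumptions, not the paper's method:

```python
import math
from collections import defaultdict

class WeightedGaussianNB:
    """Toy Gaussian naive Bayes with per-feature weights.

    Illustrative only: the paper's W-NB design is not described in the
    abstract, so this weighting scheme is an assumption.
    """

    def fit(self, X, y):
        self.priors = {}
        self.stats = {}  # class -> [(mean, variance), ...] per feature
        by_class = defaultdict(list)
        for xi, yi in zip(X, y):
            by_class[yi].append(xi)
        for c, rows in by_class.items():
            self.priors[c] = len(rows) / len(X)
            feats = []
            for col in zip(*rows):
                mean = sum(col) / len(col)
                var = sum((v - mean) ** 2 for v in col) / len(col) + 1e-9
                feats.append((mean, var))
            self.stats[c] = feats
        return self

    def predict(self, x, weights):
        best, best_score = None, float("-inf")
        for c, feats in self.stats.items():
            score = math.log(self.priors[c])
            for (mean, var), v, w in zip(feats, x, weights):
                # Each feature's Gaussian log-likelihood is scaled by its weight.
                score += w * (-0.5 * math.log(2 * math.pi * var)
                              - (v - mean) ** 2 / (2 * var))
            if score > best_score:
                best, best_score = c, score
        return best

# Tiny synthetic example: one hypothetical "stress level" feature, two classes.
clf = WeightedGaussianNB().fit([[0.0], [0.1], [5.0], [5.1]], [0, 0, 1, 1])
```

Running such a lightweight classifier at the fog layer keeps the per-event classification close to the sensors, consistent with the two-phased design the abstract describes.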

RevDate: 2020-01-08
CmpDate: 2019-12-02

Kamolov A, S Park (2019)

An IoT-Based Ship Berthing Method Using a Set of Ultrasonic Sensors.

Sensors (Basel, Switzerland), 19(23):.

It is indisputable that a great many brand-new technologies, such as the Internet of Things (IoT), big data, and cloud computing, are conquering every aspect of our life. In marine technology as well, these technologies are being applied to obtain more features and to automate marine-related operations, as well as to create novel smart devices. As a result, traditional ports and ships are being replaced by smart ports and vessels. To achieve this transition, numerous applications need to be developed to make them smart. The purpose of this paper is to present a dedicated IoT-based system for automating berthing procedures by searching for available locations via port-mounted sensors and planned ship notification. In the experimental system, we used a smartphone as a stand-in for the client-side vessel and created an Android app called "Smart Ship Berthing" instead of a charting program such as NORIVIS 4, VDASH, or ODYSSEY. To test our proposed server-side system, we used a Raspberry Pi in combination with an ultrasonic sensor to detect the ship and identify an empty berth for anchoring. The experimental results show that the set of ultrasonic sensors has high accuracy in detecting ships at the port for berthing, and our proposed system is very amenable to implementation in a real marine environment.

RevDate: 2020-01-08
CmpDate: 2019-12-02

Shallari I, M O'Nils (2019)

From the Sensor to the Cloud: Intelligence Partitioning for Smart Camera Applications.

Sensors (Basel, Switzerland), 19(23): pii:s19235162.

The Internet of Things has grown quickly in the last few years, with a variety of sensing, processing, and storage devices interconnected, resulting in high data traffic. While some sensors, such as temperature or humidity sensors, produce a few bits of data periodically, imaging sensors output data in the range of megabytes every second. This poses a challenge for battery-operated smart cameras, which are required to perform intensive image processing operations on large volumes of data within energy consumption constraints. Using intelligence partitioning, we analyse the effects of different partitioning scenarios, in which processing tasks are divided between the smart camera node, the fog computing layer, and cloud computing, on node energy consumption as well as on the real-time performance of the WVSN (Wireless Vision Sensor Node). The results obtained show that traditional design space exploration approaches are inefficient for WVSNs, while intelligence partitioning improves the energy consumption of the smart camera node and meets the timing constraints.

RevDate: 2020-01-08

Shih DH, Wu TW, Liu WX, et al (2019)

An Azure ACES Early Warning System for Air Quality Index Deteriorating.

International journal of environmental research and public health, 16(23):.

With the development of industrialization and urbanization, air pollution in many countries has become more serious and has affected people's health. Air quality is a continuing concern for environmental managers and the public. An accurate air quality deterioration warning system can therefore help avoid health hazards. In this study, an air quality index (AQI) warning system based on the Azure cloud computing platform is proposed. The prediction model is based on the DFR (Decision Forest Regression), NNR (Neural Network Regression), and LR (Linear Regression) machine learning algorithms. The best algorithm was selected to predict, in real time, the six pollutant concentrations required for the AQI calculation in air quality monitoring. The experimental results show that the LR algorithm has the best performance, and the method of this study predicts AQI warnings well for the next one to three hours. It is hoped that the proposed ACES system can prevent personal health hazards and help reduce public medical costs.
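As a sketch of the LR step only (the pollutant readings below are synthetic and hypothetical, not the study's data, and the coefficients are invented), an ordinary least-squares fit over six pollutant features might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: rows are hourly observations of six pollutant
# concentrations (e.g. PM2.5, PM10, O3, CO, SO2, NO2). The "true" weights
# and intercept are made up to generate the synthetic target.
X = rng.uniform(0, 100, size=(200, 6))
true_w = np.array([0.5, 0.3, 0.4, 0.2, 0.1, 0.3])
y = X @ true_w + 5.0 + rng.normal(0, 0.1, size=200)  # next-hour target

# Ordinary least squares with an appended intercept column.
A = np.hstack([X, np.ones((200, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(x):
    """Predict the next-hour value from current pollutant readings."""
    return np.append(x, 1.0) @ coef
```

With a fit like this in hand, a warning is raised whenever the predicted value for the next one to three hours crosses the relevant AQI threshold.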

RevDate: 2020-01-08
CmpDate: 2019-11-26

McLamore ES, Palit Austin Datta S, Morgan V, et al (2019)

SNAPS: Sensor Analytics Point Solutions for Detection and Decision Support Systems.

Sensors (Basel, Switzerland), 19(22):.

In this review, we discuss the role of sensor analytics point solutions (SNAPS), a reduced complexity machine-assisted decision support tool. We summarize the approaches used for mobile phone-based chemical/biological sensors, including general hardware and software requirements for signal transduction and acquisition. We introduce SNAPS, part of a platform approach to converge sensor data and analytics. The platform is designed to consist of a portfolio of modular tools which may lend itself to dynamic composability by enabling context-specific selection of relevant units, resulting in case-based working modules. SNAPS is an element of this platform where data analytics, statistical characterization and algorithms may be delivered to the data either via embedded systems in devices, or sourced, in near real-time, from mist, fog or cloud computing resources. Convergence of the physical systems with the cyber components paves the path for SNAPS to progress to higher levels of artificial reasoning tools (ART) and emerge as data-informed decision support, as a service for general societal needs. Proof of concept examples of SNAPS are demonstrated both for quantitative data and qualitative data, each operated using a mobile device (smartphone or tablet) for data acquisition and analytics. We discuss the challenges and opportunities for SNAPS, centered around the value to users/stakeholders and the key performance indicators users may find helpful, for these types of machine-assisted tools.

RevDate: 2020-01-28

Li W, Feng C, Yu K, et al (2019)

MISS-D: A fast and scalable framework of medical image storage service based on distributed file system.

Computer methods and programs in biomedicine, 186:105189 pii:S0169-2607(19)30524-3 [Epub ahead of print].

BACKGROUND AND OBJECTIVE: Processing of medical imaging big data is deeply challenging due to the size of the data, computational complexity, secure storage, and inherent privacy issues. The traditional picture archiving and communication system, an imaging technology used in the healthcare industry, generally uses centralized high-performance disk storage arrays in practical solutions. The existing storage solutions are not suitable for the diverse range of medical imaging big data that needs to be stored reliably and accessed in a timely manner. Cloud computing is emerging as an economical solution, providing scalability, elasticity, performance, and better cost management, and cloud-based storage architectures for medical imaging big data have attracted increasing attention in industry and academia.

METHODS: This study presents a novel, fast, and scalable framework of medical image storage service based on a distributed file system. Two innovations of the framework are introduced in this paper. An integrated medical imaging content indexing file model for large-scale image sequences is designed to achieve high storage efficiency on the distributed file system. A virtual file pooling technology is proposed, which uses the memory-mapped file method to achieve an efficient data reading process and provides a data swapping strategy in the pool.

RESULTS: The experiments show that the framework not only has read and write performance that meets the requirements of real-time application domains, but also brings greater convenience for clinical system developers by supporting multiple client access types. The framework supports different user client types through unified micro-service interfaces, which basically meet the needs of clinical system development, especially for online applications. The experimental results demonstrate that the framework can meet the needs of real-time data access as well as a traditional picture archiving and communication system.

CONCLUSIONS: This framework allows rapid data access for massive medical images, as demonstrated by the online web client for the MISS-D framework implemented in this paper for real-time data interaction. The framework also provides a substantial subset of the features of existing open-source and commercial alternatives and has a wide range of potential applications.

RevDate: 2019-11-19

Koumpouros Y, A Georgoulas (2019)

A systematic review of mHealth funded R&D activities in EU: Trends, technologies and obstacles.

Informatics for health & social care [Epub ahead of print].

OBJECTIVE: This study provides a systematic review of EU-funded mHealth projects.

METHODS: The review was conducted based mainly on the Projects and Results service provided by the EU Open Data Portal. Even though the search strategy yielded a large number of results, only 45 projects finally met all the inclusion criteria.

RESULTS: The review results reveal useful information regarding mHealth solutions and trends that emerge nowadays in the EU, the diseases addressed, the level of adoption by users and providers, the technological approaches, the projects' structure, and the overall impact. New areas of application, like behavioral intervention approaches as well as an apparent trend towards affective computing, big data, cloud computing, open standards and platforms have also been recognized and recorded. Core legal issues with regard to data security and privacy still pose challenges to mHealth projects, while commercialization of the developed solutions is slow. Interdisciplinary consortia with the participation of a significant number of SMEs and public healthcare organizations are also key factors for a successful project.

CONCLUSION: The study provides researchers and decision-makers with a complete and systematically organized knowledge base in order to plan new mHealth initiatives.

RevDate: 2019-12-20

Brewer P, A Ratan (2019)

Data and replication supplement for double auction markets with snipers.

Data in brief, 27:104729.

We provide a dataset for our research article "Profitability, Efficiency and Inequality in Double Auction Markets with Snipers" [1]. This dataset [2] includes configuration files, raw output data, and replications of calculated metrics for our robot-populated market simulations. The raw data is subdivided into a hierarchy of folders corresponding to simulation treatment variables, in a 2 × 2 × 21 design for 84 treatments in total. Treatment variables include: (i) robot population ordering, either "primary" or "reverse"; (ii) two market schedules of agents' values and costs: equal-expected-profit "market 1" and unequal-expected-profit "market 2"; (iii) 21 robot populations identified by the number of Sniper Bots (0-20) on each side of the market. Each treatment directory contains a simulator input file and outputs for 10,000 periods of market data. The outputs include all acceptable buy and sell orders, all trades, profits for each agent, and market metrics such as efficiency-of-allocation, Gini coefficient, and price statistics. An additional public copy in Google Cloud is available for database query by users of Google BigQuery. The market simulator software is a private product created by Paul Brewer at Economic and Financial Technology Consulting LLC. Free open-source modules are available for tech-savvy users at the GitHub, NPM, and Docker Hub repositories and are sufficient to repeat the simulations. An easier-to-use paid market simulation product will eventually be available online from Econ1.Net. We provide instructions for repeating individual simulations using the free open-source simulator and the free container tool Docker.

RevDate: 2020-01-08

Bhandari M, Zeffiro T, M Reddiboina (2020)

Artificial intelligence and robotic surgery: current perspective and future directions.

Current opinion in urology, 30(1):48-54.

PURPOSE OF REVIEW: This review aims to draw a road-map to the use of artificial intelligence in an era of robotic surgery and highlight the challenges inherent to this process.

RECENT FINDINGS: Conventional mechanical robots function by transmitting actions of the surgeon's hands to the surgical target through the tremor-filtered movements of surgical instruments. Similarly, the next iteration of surgical robots conforms human-initiated actions to a personalized surgical plan leveraging 3D digital segmentation generated prior to surgery. The advancements in cloud computing, big data analytics, and artificial intelligence have led to increased research and development of intelligent robots in all walks of human life. Inspired by the successful application of deep learning, several surgical companies are joining hands with tech giants to develop intelligent surgical robots. We hereby highlight key steps in the handling and analysis of big data to build, define, and deploy deep-learning models for building autonomous robots.

SUMMARY: Despite tremendous growth of autonomous robotics, their entry into the operating room remains elusive. It is time that surgeons actively collaborate for the development of the next generation of intelligent robotic surgery.

RevDate: 2019-11-26

Shukla S, Hassan MF, Khan MK, et al (2019)

An analytical model to minimize the latency in healthcare internet-of-things in fog computing environment.

PloS one, 14(11):e0224934.

Fog computing (FC) is an evolving computing technology that operates in a distributed environment. FC aims to bring cloud computing features close to edge devices. The approach is expected to fulfill the minimum latency requirement of healthcare Internet-of-Things (IoT) devices. Healthcare IoT devices generate large volumes of healthcare data. This large volume of data results in high data traffic that causes network congestion and high latency. An increase in round-trip time delay, owing to large data transmissions and large hop counts between IoT devices and cloud servers, renders healthcare data meaningless and inadequate for end-users. Time-sensitive healthcare applications require real-time data. Traditional cloud servers cannot fulfill the minimum latency demands of healthcare IoT devices and end-users. Therefore, communication latency, computation latency, and network latency must be reduced for IoT data transmission. FC moves the storage, processing, and analysis of data from cloud computing to the network edge to reduce high latency. A novel solution to the abovementioned problem is proposed herein. It includes an analytical model and a hybrid fuzzy-based reinforcement learning algorithm in an FC environment. The aim is to reduce high latency among healthcare IoT devices, end-users, and cloud servers. The proposed intelligent FC analytical model and algorithm use a fuzzy inference system combined with reinforcement learning and neural network evolution strategies for data packet allocation and selection in an IoT-FC environment. The approach is tested in the simulators iFogSim (NetBeans) and Spyder (Python). The obtained results indicate better performance of the proposed approach compared with existing methods.
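The latency argument can be illustrated with a back-of-envelope model (this is not the paper's analytical model; the hop counts, link speed, and payload size below are made-up numbers for illustration):

```python
def round_trip_latency_ms(payload_kb, bandwidth_mbps, per_hop_delay_ms, hops):
    """Back-of-envelope round-trip latency: transmission time plus a fixed
    per-hop delay, each counted in both directions. Illustrative only."""
    transmission_ms = (payload_kb * 8) / (bandwidth_mbps * 1000) * 1000
    return 2 * (transmission_ms + per_hop_delay_ms * hops)

# A nearby fog node (2 hops) vs. a distant cloud data centre (12 hops)
# for a hypothetical 64 KB batch of sensor readings over a 10 Mbps link.
fog_rtt = round_trip_latency_ms(64, 10, 2.0, 2)
cloud_rtt = round_trip_latency_ms(64, 10, 2.0, 12)
```

Under any such parameterization the fog path wins on the per-hop term, which is the intuition behind moving storage and processing toward the network edge.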

RevDate: 2019-11-13

Mehdipoor H, Zurita-Milla R, Augustijn EW, et al (2019)

Exploring differences in spatial patterns and temporal trends of phenological models at continental scale using gridded temperature time-series.

International journal of biometeorology pii:10.1007/s00484-019-01826-7 [Epub ahead of print].

Phenological models are widely used to estimate the influence of weather and climate on plant development. The goodness of fit of phenological models often is assessed by considering the root-mean-square error (RMSE) between observed and predicted dates. However, the spatial patterns and temporal trends derived from models with similar RMSE may vary considerably. In this paper, we analyse and compare patterns and trends from a suite of temperature-based phenological models, namely extended spring indices, thermal time and photothermal time models. These models were first calibrated using lilac leaf onset observations for the period 1961-1994. Next, volunteered phenological observations and daily gridded temperature data were used to validate the models. After that, the two most accurate models were used to evaluate the patterns and trends of leaf onset for the conterminous US over the period 2000-2014. Our results show that the RMSEs of extended spring indices and thermal time models are similar and about 2 days lower than those produced by the other models. Yet the dates of leaf out produced by each of the models differ by up to 11 days, and the trends differ by up to a week per decade. The results from the histograms and difference maps show that the statistical significance of these trends strongly depends on the type of model applied. Therefore, further work should focus on the development of metrics that can quantify the difference between patterns and trends derived from spatially explicit phenological models. Such metrics could subsequently be used to validate phenological models in both space and time.

RevDate: 2020-01-08
CmpDate: 2019-11-18

Xu R, Jin W, D Kim (2019)

Microservice Security Agent Based On API Gateway in Edge Computing.

Sensors (Basel, Switzerland), 19(22):.

Internet of Things (IoT) devices are embedded with software, electronics, and sensors, and feature connectivity with constrained resources. They require the edge computing paradigm, with modular characteristics relying on microservices, to provide an extensible and lightweight computing framework at the edge of the network. Edge computing can relieve the burden of centralized cloud computing by performing certain operations, such as data storage and task computation, at the edge of the network. Despite its benefits, edge computing can lead to many challenges in terms of security and privacy issues. Thus, services that protect privacy and secure data are essential functions in edge computing. For example, the end user's ownership and privacy information and control are separated, which can easily lead to data leakage, unauthorized data manipulation, and other data security concerns. Thus, the confidentiality and integrity of the data cannot be guaranteed, so more secure authentication and access mechanisms are required to ensure that the microservices are exposed only to authorized users. In this paper, we propose a microservice security agent that integrates the edge computing platform with API gateway technology to present a secure authentication mechanism. The aim of this platform is to afford edge computing clients a practical application which provides user authentication and allows JSON Web Token (JWT)-based secure access to the services of edge computing. To integrate the edge computing platform with the API gateway, we implement a microservice security agent based on the open-source Kong in the EdgeX Foundry framework. Also, to provide an easy-to-use approach with Kong, we implement REST APIs for generating new consumers, registering services, and configuring access controls. Finally, the usability of the proposed approach is demonstrated by evaluating the round-trip time (RTT). The results demonstrate the efficiency of the system and its suitability for real-world applications.
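The JWT mechanism this work builds on can be illustrated with a minimal, stdlib-only HS256 token sketch (this is not Kong's implementation; in a real deployment Kong's JWT plugin issues and validates these tokens on behalf of registered consumers):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    """Build a signed HS256 JWT: header.payload.signature."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> dict:
    """Recompute the HMAC and reject tampered tokens."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    return json.loads(base64.urlsafe_b64decode(body + "=" * (-len(body) % 4)))
```

A gateway holding the shared secret can thus authenticate each request cheaply, which is what makes JWT-based access practical for resource-constrained edge clients.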

RevDate: 2020-01-08

Luo Y, Hitz BC, Gabdank I, et al (2020)

New developments on the Encyclopedia of DNA Elements (ENCODE) data portal.

Nucleic acids research, 48(D1):D882-D889.

The Encyclopedia of DNA Elements (ENCODE) is an ongoing collaborative research project aimed at identifying all the functional elements in the human and mouse genomes. Data generated by the ENCODE consortium are freely accessible at the ENCODE portal (https://www.encodeproject.org/), which is developed and maintained by the ENCODE Data Coordinating Center (DCC). Since the initial portal release in 2013, the ENCODE DCC has updated the portal to make ENCODE data more findable, accessible, interoperable and reusable. Here, we report on recent updates, including new ENCODE data and assays, ENCODE uniform data processing pipelines, new visualization tools, a dataset cart feature, unrestricted public access to ENCODE data on the cloud (Amazon Web Services open data registry, https://registry.opendata.aws/encode-project/) and more comprehensive tutorials and documentation.

RevDate: 2020-01-08

Bai J, Jhaney I, J Wells (2019)

Developing a Reproducible Microbiome Data Analysis Pipeline Using the Amazon Web Services Cloud for a Cancer Research Group: Proof-of-Concept Study.

JMIR medical informatics, 7(4):e14667.

BACKGROUND: Cloud computing for microbiome data sets can significantly increase working efficiencies and expedite the translation of research findings into clinical practice. The Amazon Web Services (AWS) cloud provides an invaluable option for microbiome data storage, computation, and analysis.

OBJECTIVE: The goals of this study were to develop a microbiome data analysis pipeline by using AWS cloud and to conduct a proof-of-concept test for microbiome data storage, processing, and analysis.

METHODS: A multidisciplinary team was formed to develop and test a reproducible microbiome data analysis pipeline with multiple AWS cloud services that could be used for storage, computation, and data analysis. The microbiome data analysis pipeline developed in AWS was tested by using two data sets: 19 vaginal microbiome samples and 50 gut microbiome samples.

RESULTS: Using AWS features, we developed a microbiome data analysis pipeline that included Amazon Simple Storage Service for microbiome sequence storage, Linux Elastic Compute Cloud (EC2) instances (ie, servers) for data computation and analysis, and security keys to create and manage the use of encryption for the pipeline. Bioinformatics and statistical tools (ie, Quantitative Insights Into Microbial Ecology 2 and RStudio) were installed within the Linux EC2 instances to run microbiome statistical analysis. The microbiome data analysis pipeline was performed through command-line interfaces within the Linux operating system or in the Mac operating system. Using this new pipeline, we were able to successfully process and analyze 50 gut microbiome samples within 4 hours at a very low cost (a c4.4xlarge EC2 instance costs $0.80 per hour). Gut microbiome findings regarding diversity, taxonomy, and abundance analyses were easily shared within our research team.

CONCLUSIONS: Building a microbiome data analysis pipeline with AWS cloud is feasible. This pipeline is highly reliable, computationally powerful, and cost effective. Our AWS-based microbiome analysis pipeline provides an efficient tool to conduct microbiome data analysis.
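The cost figure quoted in the results is easy to sanity-check from the stated hourly rate (a sketch; actual AWS pricing varies by region and changes over time):

```python
HOURLY_RATE_USD = 0.80  # c4.4xlarge rate stated in the abstract
RUN_HOURS = 4           # time to process the 50 gut microbiome samples
SAMPLES = 50

total_cost = HOURLY_RATE_USD * RUN_HOURS  # cost of the whole run
cost_per_sample = total_cost / SAMPLES    # roughly 6 cents per sample
```

At these rates the entire 50-sample run costs a few dollars, which is the basis of the abstract's cost-effectiveness claim.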

RevDate: 2020-01-21

Lo Piparo E, Siragusa L, Raymond F, et al (2020)

Bisphenol A binding promiscuity: A virtual journey through the universe of proteins.

ALTEX, 37(1):85-94.

Significant efforts are currently being made to move toxicity testing from animal experimentation to human-relevant, mechanism-based approaches. In this context, the identification of the molecular target(s) responsible for mechanisms of action is an essential step. Inspired by the recent concept of polypharmacology (the ability of drugs to interact with multiple targets), we argue that whole-proteome virtual screening might become a breakthrough tool in toxicology, reflecting the real complexity of chemical-biological interactions. Therefore, we investigated the value of performing ligand-protein binding prediction screening across the full proteome to identify new mechanisms of action for food chemicals. We applied the new approach to make a broader comparison between bisphenol A (BPA), a food-packaging chemical, and the endogenous estrogen 17β-estradiol (EST). Applying a novel high-throughput ligand-protein binding prediction tool (BioGPS) on the Amazon Web Services (AWS) cloud (to speed up the calculation), we investigated the value of performing in silico screening across the full proteome (all human and rodent X-ray protein structures available in the Protein Data Bank). The strong correlation between in silico predictions and available in vitro data demonstrates the high predictive power of the method. The most striking result was that BPA was predicted to bind many more proteins than those already known, most of which it shares with EST. Our findings provide new and unprecedented insight into the complexity of chemical-protein interactions, highlighting the binding promiscuity of BPA and its broad similarity to the female sex hormone EST.

RevDate: 2020-01-10
CmpDate: 2020-01-10

Heldenbrand JR, Baheti S, Bockol MA, et al (2019)

Recommendations for performance optimizations when using GATK3.8 and GATK4.

BMC bioinformatics, 20(1):557.

BACKGROUND: Use of the Genome Analysis Toolkit (GATK) continues to be the standard practice in genomic variant calling in both research and the clinic. The toolkit has recently been evolving rapidly: significant computational performance improvements were introduced in GATK3.8 through a collaboration with Intel in 2017, and the first release of GATK4 in early 2018 brought rewrites in the code base as a stepping stone toward a Spark implementation. As the software continues to be a moving target for optimal deployment in highly productive environments, we present a detailed analysis of these improvements, to help the community stay abreast of changes in performance.

RESULTS: We re-evaluated multiple options, such as threading, parallel garbage collection, I/O options and data-level parallelization. Additionally, we considered the trade-offs of using GATK3.8 and GATK4. We found optimized parameter values that reduce the time of executing the best-practices variant calling procedure by 29.3% for GATK3.8 and 16.9% for GATK4. Further speedups can be accomplished by splitting data for parallel analysis, resulting in a run time of only a few hours on a whole human genome sequenced to a depth of 20X, for both versions of GATK. Nonetheless, GATK4 is already much more cost-effective than GATK3.8. Thanks to significant rewrites of the algorithms, the same analysis can be run largely in a single-threaded fashion, allowing users to process multiple samples on the same CPU.
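The "splitting data for parallel analysis" speedup is a scatter-gather pattern: the genome is divided into interval chunks with roughly equal work, each processed on its own node or thread. A minimal sketch of the scatter step (the chromosome sizes and chunk count below are illustrative, not the paper's configuration):

```python
def split_intervals(chrom_sizes, n_chunks):
    """Greedily assign chromosomes to n_chunks so total bases are balanced,
    mimicking the scatter step of a scatter-gather variant-calling run."""
    chunks = [[] for _ in range(n_chunks)]
    loads = [0] * n_chunks
    # Largest chromosomes first, each to the currently lightest chunk
    for chrom, size in sorted(chrom_sizes.items(), key=lambda kv: -kv[1]):
        i = loads.index(min(loads))
        chunks[i].append(chrom)
        loads[i] += size
    return chunks, loads

# Illustrative sizes (Mb) for a few human chromosomes
sizes = {"chr1": 249, "chr2": 243, "chr3": 198, "chr4": 190,
         "chr5": 182, "chr6": 171, "chr7": 159, "chr8": 146}
chunks, loads = split_intervals(sizes, 4)
```

Each chunk would then be handed to a separate GATK invocation restricted to those intervals, with the resulting per-chunk VCFs merged in a gather step.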

CONCLUSIONS: In time-sensitive situations, when a patient has a critical or rapidly developing condition, it is useful to minimize the time to process a single sample. In such cases we recommend using GATK3.8 by splitting the sample into chunks and computing across multiple nodes. The resultant walltime will be nnn.4 hours at a cost of $41.60 on 4 c5.18xlarge instances of the Amazon cloud. For cost-effectiveness of routine analyses or for large population studies, it is useful to maximize the number of samples processed per unit time. Thus we recommend GATK4, running multiple samples on one node. The total walltime will be ∼34.1 hours for 40 samples, with 1.18 samples processed per hour at a cost of $2.60 per sample on a c5.18xlarge instance of the Amazon cloud.
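The per-sample cost figures follow from simple walltime-times-hourly-rate arithmetic. A sketch reproducing the GATK4 estimate, assuming the on-demand us-east-1 rate for c5.18xlarge of roughly $3.06/hour (an assumption on my part; AWS prices vary by region and change over time):

```python
# Reproduce the abstract's GATK4 cost estimate from walltime and
# instance price.
HOURLY_RATE = 3.06   # USD/hour for one c5.18xlarge (assumed rate)
walltime_h  = 34.1   # hours to process 40 samples on one node (abstract)
samples     = 40

throughput = samples / walltime_h                  # samples per hour
cost_per_sample = walltime_h * HOURLY_RATE / samples
print(round(throughput, 2), round(cost_per_sample, 2))
```

Both values land within rounding of the abstract's figures (∼1.18 samples/hour, ∼$2.60 per sample), which suggests the authors priced on-demand instances rather than spot capacity.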

RevDate: 2020-01-08

Lu TJ, Zhong X, Zhong L, et al (2019)

A location-aware feature extraction algorithm for image recognition in mobile edge computing.

Mathematical biosciences and engineering : MBE, 16(6):6672-6682.

With the explosive growth of mobile devices, it is feasible to deploy image recognition applications on mobile devices to provide image recognition services. However, the traditional mobile cloud computing architecture cannot meet the demands of real-time response and high accuracy, since users must upload raw images to remote central cloud servers. The emerging architecture, Mobile Edge Computing (MEC), deploys small-scale servers at the edge of the network, which can provide computing and storage resources for image recognition applications. To this end, in this paper we use the MEC architecture to provide an image recognition service. Moreover, to guarantee real-time response and high accuracy, we also provide a feature extraction algorithm that extracts discriminative features from the raw image. In doing so, the response time can be further reduced and the accuracy improved. The experimental results show that combining the MEC architecture with the proposed feature extraction algorithm not only greatly reduces response time but also improves the accuracy of image recognition applications.
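The latency win comes from uploading a compact feature vector instead of the raw image. The paper's algorithm is location-aware; as a generic stand-in for that idea, a sketch that compresses a grayscale image into a small intensity histogram (all names and sizes here are illustrative, not the authors' method):

```python
def histogram_feature(pixels, bins=16):
    """Compress an 8-bit grayscale image (flat list of 0-255 values)
    into a normalized intensity histogram. The edge device then uploads
    `bins` floats instead of the full pixel payload."""
    counts = [0] * bins
    width = 256 // bins
    for p in pixels:
        counts[min(p // width, bins - 1)] += 1
    total = len(pixels)
    return [c / total for c in counts]

# A tiny synthetic 8x8 "image": the payload shrinks from 64 pixel
# values to a 16-element feature vector
image = [i * 4 for i in range(64)]
feature = histogram_feature(image)
```

Any feature extractor used this way trades some information loss for bandwidth and response time, which is why the paper's contribution is choosing features that remain discriminative.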

RevDate: 2020-01-08
CmpDate: 2019-11-13

Basir R, Qaisar S, Ali M, et al (2019)

Fog Computing Enabling Industrial Internet of Things: State-of-the-Art and Research Challenges.

Sensors (Basel, Switzerland), 19(21):.

Industry is going through a transformation phase, enabling automation and data exchange in manufacturing technologies and processes; this transformation is called Industry 4.0. Industrial Internet-of-Things (IIoT) applications require real-time processing, nearby storage, ultra-low latency, reliability and high data rates, all of which can be satisfied by a fog computing architecture. With smart devices expected to grow exponentially, the need for an optimized fog computing architecture and protocols is crucial. Efficient, intelligent and decentralized solutions are required to ensure real-time connectivity, reliability and green communication. In this paper, we provide a comprehensive review of methods and techniques in fog computing, focusing on fog infrastructure and protocols in the context of IIoT applications. The article covers two main research areas: in the first half, we discuss the history of the industrial revolution and the application areas of IIoT, followed by the key enabling technologies that act as building blocks for industrial transformation. In the second half, we focus on fog computing as an enabler for IIoT application domains, providing solutions to critical challenges. Finally, open research challenges are discussed to highlight aspects of fog computing across different fields and technologies.

RevDate: 2019-11-05

Vargas-Salgado C, Aguila-Leon J, Chiñas-Palacios C, et al (2019)

Low-cost web-based Supervisory Control and Data Acquisition system for a microgrid testbed: A case study in design and implementation for academic and research applications.

Heliyon, 5(9):e02474 pii:e02474.

This paper presents the design and implementation of a low-cost Supervisory Control and Data Acquisition (SCADA) system based on a Web interface, applied to a Hybrid Renewable Energy System (HRES) microgrid. The development provides a reliable, low-cost control and data acquisition system for the Renewable Energy Laboratory at Universitat Politècnica de València (LabDER-UPV) in Spain, oriented to research on microgrid stability and energy generation. The low-cost SCADA operates on a microgrid that incorporates a photovoltaic array, a wind turbine, a biomass gasification plant and a battery bank as an energy storage system. Readings from sensors and power meters for electrical parameters such as voltage, current, frequency, power factor, power generation and energy consumption are processed digitally by Arduino-based devices. A master device on a Raspberry Pi board sends all this information to a local database (DB) and to a MySQL web database linked to a Web SCADA interface programmed in HTML5. The communication protocols include TCP/IP, I2C, SPI and serial communication; Arduino-based slave devices communicate with the master Raspberry Pi using NRF24L01 wireless radio-frequency transceivers. Finally, a comparison between a standard SCADA system and the developed Web-based SCADA system is carried out. The results of the operational tests and the cost comparison show that the in-house Web-SCADA system is reliable and low cost (on average 86% cheaper than a standard commercial solution) for control, monitoring and data logging, as well as for local and remote operation, when applied to the HRES microgrid testbed.
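The master node's data-logging role described above reduces to a loop that appends timestamped sensor readings to a database. A minimal sketch, using Python's built-in sqlite3 as a stand-in for the paper's MySQL database (the table and column names are illustrative, not taken from the actual system):

```python
import sqlite3
import time

# In-memory database stands in for the Raspberry Pi's local DB
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings (
    ts REAL, device TEXT, parameter TEXT, value REAL)""")

def log_reading(device, parameter, value):
    """Append one timestamped reading received from a slave device."""
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
               (time.time(), device, parameter, value))
    db.commit()

# Simulated packets from two Arduino-based slaves
log_reading("pv_array", "voltage_V", 48.7)
log_reading("wind_turbine", "power_W", 312.5)

rows = db.execute("SELECT device, value FROM readings").fetchall()
```

In the real system the same insert would target the MySQL web database so the HTML5 SCADA interface can chart readings remotely.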

RevDate: 2019-11-14

Grebner C, Malmerberg E, Shewmaker A, et al (2019)

Virtual Screening in the Cloud: How Big Is Big Enough?.

Journal of chemical information and modeling [Epub ahead of print].

Virtual screening is a standard tool in Computer-Assisted Drug Design (CADD). Early in a project, it is typical to use ligand-based similarity search methods to find suitable hit molecules. However, the number of compounds that can be screened and the time required are usually limited by computational resources. We describe here a high-throughput virtual screening project using 3D similarity (FastROCS) and automated evaluation workflows on Orion, a cloud computing platform. Cloud resources make this approach fully scalable and flexible, allowing the generation and search of billions of virtual molecules, and give access to an explicit 3D virtual chemistry space not available before. We discuss the impact of the size of the search space with respect to finding novel chemical hits and the size of the required hit list, as well as computational and economic aspects of resource scaling.

RevDate: 2019-11-08

Ongari D, Yakutovich AV, Talirz L, et al (2019)

Building a Consistent and Reproducible Database for Adsorption Evaluation in Covalent-Organic Frameworks.

ACS central science, 5(10):1663-1675.

We present a workflow that traces the path from the bulk structure of a crystalline material to assessing its performance in carbon capture from coal's postcombustion flue gases. This workflow is applied to a database of 324 covalent-organic frameworks (COFs) reported in the literature, to characterize their CO2 adsorption properties using the following steps: (1) optimization of the crystal structure (atomic positions and unit cell) using density functional theory, (2) fitting atomic point charges based on the electron density, (3) characterizing the pore geometry of the structures before and after optimization, (4) computing carbon dioxide and nitrogen isotherms using grand canonical Monte Carlo simulations with an empirical interaction potential, and finally, (5) assessing the CO2 parasitic energy via process modeling. The full workflow has been encoded in the Automated Interactive Infrastructure and Database for Computational Science (AiiDA). Both the workflow and the automatically generated provenance graph of our calculations are made available on the Materials Cloud, allowing peers to inspect every input parameter and result along the workflow, download structures and files at intermediate stages, and start their research right from where this work left off. In particular, our set of CURATED (Clean, Uniform, and Refined with Automatic Tracking from Experimental Database) COFs, with optimized geometry and high-quality DFT-derived point charges, is available for further investigations of gas adsorption properties. We plan to update the database as new COFs are reported.
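The provenance graph described above records, for every workflow step, which inputs produced which outputs, so any final number can be traced back to its source structure. A generic sketch of that bookkeeping idea (this is an illustration, not AiiDA's actual API; the step names and toy outputs are invented):

```python
# Each tracked call logs its step name, inputs and output, building a
# simple linear provenance record analogous to AiiDA's provenance graph.
provenance = []

def tracked(step_name, func, **inputs):
    out = func(**inputs)
    provenance.append({"step": step_name, "inputs": inputs, "output": out})
    return out

# Toy stand-ins for the first workflow stages
cell = tracked("optimize_geometry",
               lambda structure: structure + "_opt", structure="COF-5")
charges = tracked("fit_charges",
                  lambda structure: {"q_max": 0.42}, structure=cell)
pores = tracked("pore_geometry",
                lambda structure: {"pld_A": 9.7}, structure=cell)
```

Because each record stores the literal inputs, one can confirm that, e.g., the charge-fitting step consumed the optimized structure rather than the raw experimental one, which is exactly the kind of inspection the Materials Cloud provenance graph enables.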

