Real-Time Delay Minimization for Data Processing in Wirelessly Networked Disaster Areas

Yu Wang, Michael Conrad Meyer, Junbo Wang

Research output: Contribution to journal › Article

Abstract

Fog computing is a disruptive technology in the big data analytics area. Smartphone users and organizations rely on cellular services, and the data they collect can support decision-making in disaster scenarios. Nevertheless, the regular communication infrastructure can be damaged by disasters. NTT provided an easily deployable solution to construct an emergency communication network (ECN), but ECNs are slow at propagating big data due to their limited transmission capabilities. One major issue is how to integrate data processing into the ECN efficiently, so that data can be processed and transmitted effectively in disaster scenarios. In this paper, we present a detailed mathematical model of data processing and transmission in an ECN fog network, a proof that optimizing the overall delay is NP-hard, and a novel algorithm that minimizes the overall delay for wirelessly networked disaster areas and can run in real time. We evaluated the systems across various transmission speeds, processing speeds, and network sizes, and we also tested their calculation time, accuracy, and percentage error. Through this evaluation, we found that the proposed disaster area adaptive delay minimization (DAADM) algorithm achieved a lower overall delay across various network sizes than several conventional solutions. The DAADM algorithm matched the curve of the genetic algorithm (GA), even though its delays were not quite as small as the GA's. The DAADM had one major advantage over the GA: its processing time, which allows the DAADM to be implemented in a real-time system, where a GA solution would take far too much time.
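
This record does not reproduce the paper's mathematical model or the DAADM algorithm itself, but the kind of objective it optimizes can be illustrated with a minimal sketch. The Python below assumes a simple chain of ECN nodes toward a sink, a fixed data-reduction factor when a node processes the data, and illustrative per-node processing speeds and per-hop link rates; the greedy placement rule is a hypothetical stand-in for a fast, real-time-capable heuristic and is not the authors' DAADM, while the exhaustive search plays the role of the much slower optimal/GA baseline.

```python
import itertools

# Hypothetical node/link parameters (not taken from the paper):
# node i has a processing speed proc_speeds[i] (MB/s) and hop i on the
# path to the sink has a link rate path_rates[i] (MB/s).  Processing at
# a node shrinks the data by a fixed reduction factor before forwarding.

def overall_delay(data_mb, path_rates, proc_speeds, assignment, reduction=0.5):
    """Delay of pushing `data_mb` along a chain of ECN nodes.

    `assignment[i]` is True if node i processes (reduces) the data before
    forwarding it.  Total delay = processing delays at the chosen nodes
    + transmission delays on every hop, each computed on whatever data
    volume remains at that point.
    """
    delay, size = 0.0, data_mb
    for i, rate in enumerate(path_rates):
        if assignment[i]:
            delay += size / proc_speeds[i]   # processing delay at node i
            size *= reduction                # processed data is smaller
        delay += size / rate                 # transmission delay on hop i
    return delay

def brute_force(data_mb, path_rates, proc_speeds):
    """Exhaustive search over all processing assignments (exponential;
    stands in for the optimal/GA baseline on tiny networks)."""
    n = len(path_rates)
    return min(
        (overall_delay(data_mb, path_rates, proc_speeds, a), a)
        for a in itertools.product([False, True], repeat=n)
    )

def greedy(data_mb, path_rates, proc_speeds):
    """Greedy heuristic: process at a node only if doing so lowers the
    total delay (a stand-in for a fast rule; NOT the paper's DAADM)."""
    n = len(path_rates)
    assignment = [False] * n
    for i in range(n):
        keep = overall_delay(data_mb, path_rates, proc_speeds, assignment)
        assignment[i] = True
        if overall_delay(data_mb, path_rates, proc_speeds, assignment) > keep:
            assignment[i] = False
    return overall_delay(data_mb, path_rates, proc_speeds, assignment), assignment

if __name__ == "__main__":
    rates = [2.0, 1.5, 4.0, 1.0]   # MB/s per hop toward the sink (illustrative)
    procs = [3.0, 1.0, 5.0, 2.0]   # MB/s processing speed per node (illustrative)
    print("optimal:", brute_force(100.0, rates, procs))
    print("greedy :", greedy(100.0, rates, procs))
```

Even on this toy chain, the exhaustive search scales as 2^n in the number of nodes, which is why a fast heuristic (or a metaheuristic such as a GA) is needed once real-time operation matters.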

Original language: English
Article number: 8571231
Pages (from-to): 2928-2937
Number of pages: 10
Journal: IEEE Access
Volume: 7
DOIs: 10.1109/ACCESS.2018.2886075
Publication status: Published - 2019 Jan 1

Fingerprint

  • Disasters
  • Time delay
  • Genetic algorithms
  • Telecommunication networks
  • Fog
  • Data communication systems
  • Smartphones
  • Processing
  • Real time systems
  • Decision making
  • Mathematical models
  • Communication

Keywords

  • Ad hoc networks
  • big data applications
  • computer network management
  • edge computing
  • mobile applications
  • real-time systems

ASJC Scopus subject areas

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)

Cite this

Wang, Y., Meyer, M. C., & Wang, J. (2019). Real-Time Delay Minimization for Data Processing in Wirelessly Networked Disaster Areas. IEEE Access, 7, 2928-2937. [8571231]. https://doi.org/10.1109/ACCESS.2018.2886075