Modeling parallel execution policies of web services

Mai Xuan Trang, Yohei Murakami, Toru Ishida

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Cloud computing and high performance computing enable service providers to support parallel execution of the services they provide. Consider a client who invokes a web service to process a large dataset. The input data is split into independent partitions, and multiple partitions are sent to the service concurrently. A typical customer would expect the service speedup to be directly proportional to the number of concurrent requests (the degree of parallelism, or DOP). However, we observed that the achieved speedup is not always directly proportional to the DOP. This may be because service providers employ parallel execution policies for their services based on arbitrary decisions. The goal of this paper is to analyse the performance improvement behavior of web services under parallel execution. We introduce a model of the parallel execution policies of web services comprising three policies: Slow-down, Restriction, and Penalty. We conduct analyses to evaluate our model. Interestingly, the results show that our model has good accuracy in capturing the parallel execution behavior of web services.
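The client-side setup the abstract describes — splitting the input into independent partitions and sending them to the service concurrently at a chosen DOP — can be sketched as follows. This is a minimal illustration, not the authors' implementation: `invoke_service` is a hypothetical stand-in for the real HTTP call to the provider's endpoint, and the timing stub merely simulates per-request processing cost.

```python
# Sketch of parallel web-service invocation at a given degree of parallelism
# (DOP). The service call is simulated; in practice it would be an HTTP
# request to the provider's endpoint (hypothetical).
import time
from concurrent.futures import ThreadPoolExecutor


def invoke_service(partition):
    """Simulated service call (hypothetical): returns items processed."""
    time.sleep(0.01)  # stand-in for network latency + server processing
    return len(partition)


def parallel_execute(data, dop):
    """Split `data` into `dop` independent partitions and submit them concurrently."""
    size = max(1, -(-len(data) // dop))  # ceiling division -> partition size
    partitions = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=dop) as pool:
        results = list(pool.map(invoke_service, partitions))
    return sum(results)


data = list(range(1000))
start = time.perf_counter()
processed = parallel_execute(data, dop=4)
elapsed = time.perf_counter() - start
print(f"processed {processed} items at DOP=4 in {elapsed:.3f}s")
```

The client's naive expectation is that elapsed time shrinks (and speedup grows) linearly with `dop`; the paper's point is that a provider-side policy (Slow-down, Restriction, or Penalty) can break that proportionality.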

Original language: English
Title of host publication: Cloud Computing - 6th International Conference, CloudComp 2015
Editors: Yin Zhang, Chan-Hyun Youn, Limei Peng
Publisher: Springer-Verlag
Pages: 244-254
Number of pages: 11
ISBN (Print): 9783319389035
DOI: 10.1007/978-3-319-38904-2_25
Publication status: Published - 2016 Jan 1
Externally published: Yes
Event: 6th International Conference on Cloud Computing, CloudComp 2015 - Daejeon, Korea, Republic of
Duration: 2015 Oct 28 - 2015 Oct 29

Publication series

Name: Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST
Volume: 167
ISSN (Print): 1867-8211

Conference

Conference: 6th International Conference on Cloud Computing, CloudComp 2015
Country: Korea, Republic of
City: Daejeon
Period: 15/10/28 - 15/10/29


Keywords

  • Parallel execution
  • Performance analysis
  • Service policy

ASJC Scopus subject areas

  • Computer Networks and Communications

Cite this

Trang, M. X., Murakami, Y., & Ishida, T. (2016). Modeling parallel execution policies of web services. In Y. Zhang, C-H. Youn, & L. Peng (Eds.), Cloud Computing - 6th International Conference, CloudComp 2015 (pp. 244-254). (Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST; Vol. 167). Springer-Verlag. https://doi.org/10.1007/978-3-319-38904-2_25
