Quantitative information flow - verification hardness and possibilities

Hirotoshi Yasuoka, Tachio Terauchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

38 Citations (Scopus)

Abstract

Researchers have proposed formal definitions of quantitative information flow based on information-theoretic notions such as the Shannon entropy, the min entropy, the guessing entropy, and channel capacity. This paper investigates the hardness and possibilities of precisely checking and inferring quantitative information flow according to such definitions. We prove that, even for just comparing two programs to determine which has the larger flow, none of the definitions is a k-safety property for any k, and therefore is not amenable to the self-composition technique that has been successfully applied to precisely checking non-interference. We also show a complexity-theoretic gap with non-interference by proving that, for loop-free Boolean programs, whose non-interference is coNP-complete, the comparison problem is #P-hard for all of the definitions. For positive results, we show that universally quantifying the distribution in the comparison problem, that is, comparing two programs according to the entropy-based definitions to determine which has the larger flow for all distributions, is a 2-safety problem in general and is coNP-complete when restricted to loop-free Boolean programs. We prove this by showing that the problem is equivalent to a simple relation naturally expressing the fact that one program is more secure than the other. We prove that the relation also refines the channel-capacity-based definition, and that it can be precisely checked via self-composition as well as the "interleaved" self-composition technique.
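The self-composition technique mentioned in the abstract can be illustrated on a toy example. The following sketch is not taken from the paper; the program and function names are hypothetical. It exploits the fact that non-interference is a 2-safety property: a loop-free Boolean program is non-interferent exactly when any two runs that agree on the low (public) input but may differ on the high (secret) input produce the same output, which for a finite Boolean program can be checked by exhaustive self-composed pairing.

```python
# Sketch: checking non-interference of tiny loop-free Boolean programs
# via self-composition (compare two runs that agree on the low input).
from itertools import product

def secure_prog(high: bool, low: bool) -> bool:
    """Toy loop-free Boolean program; the output is independent of `high`."""
    return low and (high or not high)   # semantically equal to `low`

def leaky_prog(high: bool, low: bool) -> bool:
    """Toy leaky program; knowing `low`, the output reveals `high`."""
    return high ^ low

def non_interferent(prog) -> bool:
    # Self-composition: run `prog` twice with the same low input but
    # arbitrary high inputs h1, h2; all such output pairs must coincide.
    return all(
        prog(h1, low) == prog(h2, low)
        for h1, h2, low in product([False, True], repeat=3)
    )
```

As the paper's negative results show, this pairing trick does not extend to comparing the quantitative flow of two programs, since none of the entropy-based comparison problems is k-safety for any k.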

Original language: English
Title of host publication: 23rd IEEE Computer Security Foundations Symposium, CSF 2010
Pages: 15-27
Number of pages: 13
DOI: 10.1109/CSF.2010.9
ISBN: 9780769540825
Publication status: Published - 2010
Externally published: Yes
Event: 23rd Computer Security Foundations Symposium, CSF 2010 - Edinburgh, United Kingdom
Duration: 2010 Jul 17 - 2010 Jul 19


Fingerprint

Entropy
Hardness
Channel capacity

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Yasuoka, H., & Terauchi, T. (2010). Quantitative information flow - verification hardness and possibilities. In 23rd IEEE Computer Security Foundations Symposium, CSF 2010 (pp. 15-27). [5552655] https://doi.org/10.1109/CSF.2010.9
