
(*  Title:       Noninterference Security in Communicating Sequential Processes
    Author:      Pasquale Noce
                 Security Certification Specialist at Arjo Systems - Gep S.p.A.
                 pasquale dot noce dot lavoro at gmail dot com
                 pasquale dot noce at arjowiggins-it dot com
*)

section "Noninterference in CSP"

theory CSPNoninterference
imports Main
begin

text ‹
\null

An extension of classical noninterference security for deterministic state machines, as introduced
by Goguen and Meseguer \<^cite>‹"R2"› and elegantly formalized by Rushby \<^cite>‹"R3"›, to
nondeterministic systems should satisfy two fundamental requirements: it should be based on a
mathematically precise theory of nondeterminism, and it should be equivalent to (or at least not
weaker than) the classical notion in the degenerate deterministic case.

The purpose of this section is to formulate a definition of noninterference security that meets
these requirements, applying to the concept of process as formalized by Hoare in his remarkable
theory of Communicating Sequential Processes (CSP) \<^cite>‹"R1"›. The general case of a possibly
intransitive noninterference policy will be considered.

Throughout this paper, the salient points of definitions and proofs are commented; for additional
information, see Isabelle documentation, particularly \<^cite>‹"R5"›, \<^cite>‹"R6"›, \<^cite>‹"R7"›,
and \<^cite>‹"R8"›.
›


subsection "Processes"

text ‹
It is convenient to represent CSP processes by means of a type definition including a type variable,
which stands for the process alphabet. Type ‹process› shall then be isomorphic to the subset
of the product type of failures sets and divergences sets comprised of the pairs that satisfy the
properties enunciated in \<^cite>‹"R1"›, section 3.9. Such subset shall be shown to contain process
\emph{STOP}, which proves that it is nonempty.

Property \emph{C5} is not considered as it is entailed by \emph{C7}. Moreover, the formalization of
properties \emph{C2} and \emph{C6} only takes into account event lists \emph{t} containing a single
item. Such formulation is equivalent to the original one, since the truth of \emph{C2} and \emph{C6}
for a singleton list \emph{t} immediately derives from that for a generic list, and conversely:

\begin{itemize}

\item
the truth of \emph{C2} and \emph{C6} for a generic nonempty list \emph{t} results from the repeated
application of \emph{C2} and \emph{C6} for a singleton list;

\item
the truth of \emph{C2} for \emph{t} matching the empty list is implied by property \emph{C3};

\item
the truth of \emph{C6} for \emph{t} matching the empty list is a tautology.

\end{itemize}

The advantage of the proposed formulation is that it facilitates the task of proving that pairs of
failures and divergences sets defined inductively are indeed processes, viz. are included in the set
of pairs isomorphic to type ‹process›, since the introduction rules in such inductive
definitions will typically construct process traces by appending one item at a time.

In what follows, the concept of process is formalized according to the previous considerations.

\null
›

type_synonym 'a failure = "'a list × 'a set"

type_synonym 'a process_prod = "'a failure set × 'a list set"

definition process_prop_1 :: "'a process_prod ⇒ bool" where
"process_prop_1 P ≡ ([], {}) ∈ fst P"

definition process_prop_2 :: "'a process_prod ⇒ bool" where
"process_prop_2 P ≡ ∀xs x X. (xs @ [x], X) ∈ fst P ⟶ (xs, {}) ∈ fst P"

definition process_prop_3 :: "'a process_prod ⇒ bool" where
"process_prop_3 P ≡ ∀xs X Y. (xs, Y) ∈ fst P ∧ X ⊆ Y ⟶ (xs, X) ∈ fst P"

definition process_prop_4 :: "'a process_prod ⇒ bool" where
"process_prop_4 P ≡ ∀xs x X. (xs, X) ∈ fst P ⟶
  (xs @ [x], {}) ∈ fst P ∨ (xs, insert x X) ∈ fst P"

definition process_prop_5 :: "'a process_prod ⇒ bool" where
"process_prop_5 P ≡ ∀xs x. xs ∈ snd P ⟶ xs @ [x] ∈ snd P"

definition process_prop_6 :: "'a process_prod ⇒ bool" where
"process_prop_6 P ≡ ∀xs X. xs ∈ snd P ⟶ (xs, X) ∈ fst P"

definition process_set :: "'a process_prod set" where
"process_set ≡ {P.
  process_prop_1 P ∧
  process_prop_2 P ∧
  process_prop_3 P ∧
  process_prop_4 P ∧
  process_prop_5 P ∧
  process_prop_6 P}"

typedef 'a process = "process_set :: 'a process_prod set"
by (rule_tac x = "({(xs, X). xs = []}, {})" in exI, simp add:
  process_set_def
  process_prop_1_def
  process_prop_2_def
  process_prop_3_def
  process_prop_4_def
  process_prop_5_def
  process_prop_6_def)
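
text ‹
\null

As an illustration, the witness used in the nonemptiness proof above can be turned into an actual
process, which corresponds to Hoare's process \emph{STOP}. The following definition is not part of
the original development; the name ‹stop› is introduced here merely for illustration.

\null
›

definition stop :: "'a process" where
"stop ≡ Abs_process ({(xs, X). xs = []}, {})"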

text ‹
\null

Here below are the definitions of some functions acting on processes. Functions ‹failures›,
‹traces›, and ‹deterministic› match the homonymous notions defined in \<^cite>‹"R1"›. As for
the other ones:

\begin{itemize}

\item
‹futures P xs› matches the failures set of process \emph{P / xs};

\item
‹refusals P xs› matches the refusals set of process \emph{P / xs};

\item
‹next_events P xs› matches the event set (\emph{P / xs})\textsuperscript{0}.

\end{itemize}

\null
›

definition failures :: "'a process ⇒ 'a failure set" where
"failures P ≡ fst (Rep_process P)"

definition futures :: "'a process ⇒ 'a list ⇒ 'a failure set" where
"futures P xs ≡ {(ys, Y). (xs @ ys, Y) ∈ failures P}"

definition traces :: "'a process ⇒ 'a list set" where
"traces P ≡ Domain (failures P)"

definition refusals :: "'a process ⇒ 'a list ⇒ 'a set set" where
"refusals P xs ≡ failures P `` {xs}"

definition next_events :: "'a process ⇒ 'a list ⇒ 'a set" where
"next_events P xs ≡ {x. xs @ [x] ∈ traces P}"

definition deterministic :: "'a process ⇒ bool" where
"deterministic P ≡
  ∀xs ∈ traces P. ∀X. X ∈ refusals P xs = (X ∩ next_events P xs = {})"
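
text ‹
\null

As a simple sanity check, property ‹process_prop_1› entails that the empty event list is a trace
of every process. The following lemma and its name are not part of the original development and
are added here merely as an illustration of the above definitions.

\null
›

lemma traces_nil: "[] ∈ traces P"
proof -
  have "Rep_process P ∈ process_set" by (rule Rep_process)
  hence "([], {}) ∈ failures P"
   by (simp add: failures_def process_set_def process_prop_1_def)
  hence "[] ∈ Domain (failures P)" by (rule DomainI)
  thus ?thesis by (simp add: traces_def)
qed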

text ‹
\null

In what follows, properties ‹process_prop_2› and ‹process_prop_3› of processes are put
into the form of introduction rules, which will turn out to be useful in subsequent proofs.
Particularly, the more general formulation of ‹process_prop_2› as given in \<^cite>‹"R1"› (section
3.9, property \emph{C2}) is restored, and it is expressed in terms of both functions
‹failures› and ‹futures›.

\null
›

lemma process_rule_2: "(xs @ [x], X) ∈ failures P ⟹ (xs, {}) ∈ failures P"
proof (simp add: failures_def)
  have "Rep_process P ∈ process_set" (is "?P' ∈ _") by (rule Rep_process)
  hence "∀xs x X. (xs @ [x], X) ∈ fst ?P' ⟶ (xs, {}) ∈ fst ?P'"
   by (simp add: process_set_def process_prop_2_def)
  thus "(xs @ [x], X) ∈ fst ?P' ⟹ (xs, {}) ∈ fst ?P'" by blast
qed

lemma process_rule_3: "(xs, Y) ∈ failures P ⟹ X ⊆ Y ⟹ (xs, X) ∈ failures P"
proof (simp add: failures_def)
  have "Rep_process P ∈ process_set" (is "?P' ∈ _") by (rule Rep_process)
  hence "∀xs X Y. (xs, Y) ∈ fst ?P' ∧ X ⊆ Y ⟶ (xs, X) ∈ fst ?P'"
   by (simp add: process_set_def process_prop_3_def)
  thus "(xs, Y) ∈ fst ?P' ⟹ X ⊆ Y ⟹ (xs, X) ∈ fst ?P'" by blast
qed

lemma process_rule_2_failures [rule_format]:
 "(xs @ xs', X) ∈ failures P ⟶ (xs, {}) ∈ failures P"
proof (induction xs' arbitrary: X rule: rev_induct, rule_tac [!] impI, simp)
  fix X
  assume "(xs, X) ∈ failures P"
  moreover have "{} ⊆ X" ..
  ultimately show "(xs, {}) ∈ failures P" by (rule process_rule_3)
next
  fix x xs' X
  assume "⋀X. (xs @ xs', X) ∈ failures P ⟶ (xs, {}) ∈ failures P"
  hence "(xs @ xs', {}) ∈ failures P ⟶ (xs, {}) ∈ failures P" .
  moreover assume "(xs @ xs' @ [x], X) ∈ failures P"
  hence "((xs @ xs') @ [x], X) ∈ failures P" by simp
  hence "(xs @ xs', {}) ∈ failures P" by (rule process_rule_2)
  ultimately show "(xs, {}) ∈ failures P" ..
qed

lemma process_rule_2_futures:
 "(ys @ ys', Y) ∈ futures P xs ⟹ (ys, {}) ∈ futures P xs"
by (simp add: futures_def, simp only: append_assoc [symmetric], rule process_rule_2_failures)
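
text ‹
\null

For instance, rule ‹process_rule_2_failures› entails that the set of the traces of a process is
closed under prefixes of the form ‹xs› of a trace ‹xs @ xs'›. The following lemma and its name are
not part of the original development and are added here merely as an illustration; the proof is a
sketch relying on the rules derived above.

\null
›

lemma traces_prefix:
  assumes A: "xs @ xs' ∈ traces P"
  shows "xs ∈ traces P"
proof -
  obtain X where "(xs @ xs', X) ∈ failures P"
   using A by (auto simp add: traces_def Domain_iff)
  hence "(xs, {}) ∈ failures P" by (rule process_rule_2_failures)
  hence "xs ∈ Domain (failures P)" by (rule DomainI)
  thus ?thesis by (simp add: traces_def)
qed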


subsection "Noninterference"

text ‹
In the classical theory of noninterference, a deterministic state machine is considered to be secure
just in case, for any trace of the machine and any action occurring next, the observable effect of
the action, i.e. the produced output, is compatible with the assigned noninterference policy.

Thus, by analogy, it seems reasonable to regard a process as being noninterference-secure just in
case, for any of its traces and any event occurring next, the observable effect of the event, i.e.
the set of the possible futures of the process, is compatible with a given noninterference policy.

More precisely, let ‹sinks I D u xs› be the set of the security domains of the events within
event list ‹xs› that may be affected by domain ‹u› according to interference relation
‹I›, where ‹D› is the mapping of events into their domains. Since the general case of a
possibly intransitive relation ‹I› is considered, function ‹sinks› has to be defined
recursively, similarly to what happens for function \emph{sources} in \<^cite>‹"R3"›. However,
contrariwise to function \emph{sources}, function ‹sinks› takes into account the influence of
the input domain on the input event list, so that the recursive decomposition of the latter has to
be performed by item appending rather than prepending.

Furthermore, let ‹ipurge_tr I D u xs› be the sublist of event list ‹xs› obtained by
recursively deleting the events that may be affected by domain ‹u› as detected via function
‹sinks›, and ‹ipurge_ref I D u xs X› be the subset of refusal ‹X› whose elements
may not be affected by either ‹u› or any domain in ‹sinks I D u xs›.

Then, a process ‹P› is secure just in case, for each event list ‹xs› and each
‹(y # ys, Y), (zs, Z) ∈ futures P xs›, both of the following conditions are satisfied:

\begin{itemize}

\item
‹(ipurge_tr I D (D y) ys, ipurge_ref I D (D y) ys Y) ∈ futures P xs›.
\\Otherwise, the absence of event ‹y› after ‹xs› would affect the possibility for pair
‹(ipurge_tr I D (D y) ys, ipurge_ref I D (D y) ys Y)› to occur as a future of ‹xs›,
although its components, except for the deletion of ‹y›, are those of possible future
‹(y # ys, Y)› deprived of any event allowed to be affected by ‹y›.

\item
‹(y # ipurge_tr I D (D y) zs, ipurge_ref I D (D y) zs Z)›
\\‹∈ futures P xs›.
\\Otherwise, the presence of event ‹y› after ‹xs› would affect the possibility for pair
‹(y # ipurge_tr I D (D y) zs, ipurge_ref I D (D y) zs Z)› to occur as a future of
‹xs›, although its components, except for the addition of ‹y›, are those of possible
future ‹(zs, Z)› deprived of any event allowed to be affected by ‹y›.

\end{itemize}

Observe that this definition of security, henceforth referred to as
\emph{CSP noninterference security}, does not rest on the supposition that noninterference policy
‹I› be reflexive, even though any policy of practical significance will be such.

Moreover, this simpler formulation is equivalent to the one obtained by restricting the range of
event list ‹xs› to the traces of process ‹P›. In fact, for each ‹zs›, ‹Z›,
‹(zs, Z) ∈ futures P xs› just in case ‹(xs @ zs, Z) ∈ failures P›, which by virtue
of rule ‹process_rule_2_failures› implies that ‹xs› is a trace of ‹P›. Therefore,
formula ‹(zs, Z) ∈ futures P xs› is invariably false in case ‹xs› is not a trace of
‹P›.

Here below are the formal counterparts of the definitions discussed so far.

\null
›

function sinks :: "('d × 'd) set ⇒ ('a ⇒ 'd) ⇒ 'd ⇒ 'a list ⇒ 'd set" where
"sinks _ _ _ [] = {}" |
"sinks I D u (xs @ [x]) = (if (u, D x) ∈ I ∨ (∃v ∈ sinks I D u xs. (v, D x) ∈ I)
  then insert (D x) (sinks I D u xs)
  else sinks I D u xs)"
proof (atomize_elim, simp_all add: split_paired_all)
qed (rule rev_cases, rule disjI1, assumption, simp)
termination by lexicographic_order

function ipurge_tr :: "('d × 'd) set ⇒ ('a ⇒ 'd) ⇒ 'd ⇒ 'a list ⇒ 'a list" where
"ipurge_tr _ _ _ [] = []" |
"ipurge_tr I D u (xs @ [x]) = (if D x ∈ sinks I D u (xs @ [x])
  then ipurge_tr I D u xs
  else ipurge_tr I D u xs @ [x])"
proof (atomize_elim, simp_all add: split_paired_all)
qed (rule rev_cases, rule disjI1, assumption, simp)
termination by lexicographic_order

definition ipurge_ref ::
 "('d × 'd) set ⇒ ('a ⇒ 'd) ⇒ 'd ⇒ 'a list ⇒ 'a set ⇒ 'a set" where
"ipurge_ref I D u xs X ≡
  {x ∈ X. (u, D x) ∉ I ∧ (∀v ∈ sinks I D u xs. (v, D x) ∉ I)}"

definition secure :: "'a process ⇒ ('d × 'd) set ⇒ ('a ⇒ 'd) ⇒ bool" where
"secure P I D ≡
  ∀xs y ys Y zs Z. (y # ys, Y) ∈ futures P xs ∧ (zs, Z) ∈ futures P xs ⟶
  (ipurge_tr I D (D y) ys, ipurge_ref I D (D y) ys Y) ∈ futures P xs ∧
  (y # ipurge_tr I D (D y) zs, ipurge_ref I D (D y) zs Z) ∈ futures P xs"
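
text ‹
\null

As a small worked example, which is not part of the original development and is added here merely
for illustration, consider three domains ‹u›, ‹v›, ‹w›, the intransitive policy
‹I = {(u, v), (v, w)}›, and events ‹a›, ‹b› with ‹D a = v› and ‹D b = w›. Then:

\begin{itemize}

\item
‹sinks I D u [a, b] = {v, w}›: ‹(u, v) ∈ I› puts ‹v› into the set, and ‹(v, w) ∈ I› then puts
‹w› into it as well, even though ‹(u, w) ∉ I›; accordingly, ‹ipurge_tr I D u [a, b] = []›;

\item
‹sinks I D u [b, a] = {v}› and ‹ipurge_tr I D u [b, a] = [b]›: event ‹b›, occurring before ‹a›,
cannot have been affected by ‹u› via ‹v›, which is why the recursive decomposition is performed by
item appending rather than prepending;

\item
‹ipurge_ref I D u [a] X› retains exactly those events of ‹X› whose domain is neither ‹v›, which
may be affected directly, nor ‹w›, which may be affected via ‹v›.

\end{itemize}

\null
›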

text ‹
\null

The continuation of this section is dedicated to the demonstration of some lemmas concerning
functions ‹sinks›, ‹ipurge_tr›, and ‹ipurge_ref›, which will turn out to be useful
in subsequent proofs.

\null
›

lemma sinks_cons_same:
  assumes R: "refl I"
  shows "sinks I D (D x) (x # xs) = insert (D x) (sinks I D (D x) xs)"
proof (rule rev_induct, simp)
  have A: "[x] = [] @ [x]" by simp
  have "sinks I D (D x) [x] = (if (D x, D x) ∈ I ∨ (∃v ∈ {}. (v, D x) ∈ I)
    then insert (D x) {}
    else {})"
   by (subst A, simp only: sinks.simps)
  moreover have "(D x, D x) ∈ I" using R by (simp add: refl_on_def)
  ultimately show "sinks I D (D x) [x] = {D x}" by simp
next
  fix x' xs
  assume A: "sinks I D (D x) (x # xs) = insert (D x) (sinks I D (D x) xs)"
  show "sinks I D (D x) (x # xs @ [x']) =
    insert (D x) (sinks I D (D x) (xs @ [x']))"
  proof (cases "(D x, D x') ∈ I ∨ (∃v ∈ sinks I D (D x) xs. (v, D x') ∈ I)",
   simp_all (no_asm_simp))
    case True
    hence "(D x, D x') ∈ I ∨ (∃v ∈ sinks I D (D x) (x # xs). (v, D x') ∈ I)"
     using A by simp
    hence "sinks I D (D x) ((x # xs) @ [x']) =
      insert (D x') (sinks I D (D x) (x # xs))"
     by (simp only: sinks.simps if_True)
    thus "sinks I D (D x) (x # xs @ [x']) =
      insert (D x) (insert (D x') (sinks I D (D x) xs))"
     using A by (simp add: insert_commute)
  next
    case False
    hence "¬ ((D x, D x') ∈ I ∨ (∃v ∈ sinks I D (D x) (x # xs). (v, D x') ∈ I))"
     using A by simp
    hence "sinks I D (D x) ((x # xs) @ [x']) = sinks I D (D x) (x # xs)"
     by (simp only: sinks.simps if_False)
    thus "sinks I D (D x) (x # xs @ [x']) = insert (D x) (sinks I D (D x) xs)"
     using A by simp
  qed
qed

lemma ipurge_tr_cons_same:
  assumes R: "refl I"
  shows "ipurge_tr I D (D x) (x # xs) = ipurge_tr I D (D x) xs"
proof (induction xs rule: rev_induct, simp)
  have A: "[x] = [] @ [x]" by simp
  have "ipurge_tr I D (D x) [x] = (if D x ∈ sinks I D (D x) ([] @ [x])
    then []
    else [] @ [x])"
   by (subst A, simp only: ipurge_tr.simps)
  moreover have "sinks I D (D x) [x] = {D x}"
   using R by (simp add: sinks_cons_same)
  ultimately show "ipurge_tr I D (D x) [x] = []" by simp
next
  fix x' xs
  assume A: "ipurge_tr I D (D x) (x # xs) = ipurge_tr I D (D x) xs"
  show "ipurge_tr I D (D x) (x # xs @ [x']) = ipurge_tr I D (D x) (xs @ [x'])"
  proof (cases "D x' ∈ sinks I D (D x) (x # xs @ [x'])")
    assume B: "D x' ∈ sinks I D (D x) (x # xs @ [x'])"
    hence "D x' ∈ sinks I D (D x) ((x # xs) @ [x'])" by simp
    hence "ipurge_tr I D (D x) ((x # xs) @ [x']) = ipurge_tr I D (D x) (x # xs)"
     by (simp only: ipurge_tr.simps if_True)
    hence C: "ipurge_tr I D (D x) (x # xs @ [x']) = ipurge_tr I D (D x) xs"
     using A by simp
    have "D x' = D x ∨ D x' ∈ sinks I D (D x) (xs @ [x'])"
     using R and B by (simp add: sinks_cons_same)
    moreover {
      assume "D x' = D x"
      hence "(D x, D x') ∈ I" using R by (simp add: refl_on_def)
      hence "ipurge_tr I D (D x) (xs @ [x']) = ipurge_tr I D (D x) xs" by simp
    }
    moreover {
      assume "D x' ∈ sinks I D (D x) (xs @ [x'])"
      hence "ipurge_tr I D (D x) (xs @ [x']) = ipurge_tr I D (D x) xs" by simp
    }
    ultimately have D: "ipurge_tr I D (D x) (xs @ [x']) = ipurge_tr I D (D x) xs"
     by blast
    show ?thesis using C and D by simp
  next
    assume B: "D x' ∉ sinks I D (D x) (x # xs @ [x'])"
    hence "D x' ∉ sinks I D (D x) ((x # xs) @ [x'])" by simp
    hence "ipurge_tr I D (D x) ((x # xs) @ [x']) =
      ipurge_tr I D (D x) (x # xs) @ [x']"
     by (simp only: ipurge_tr.simps if_False)
    hence "ipurge_tr I D (D x) (x # xs @ [x']) = ipurge_tr I D (D x) xs @ [x']"
     using A by simp
    moreover have "¬ (D x' = D x ∨ D x' ∈ sinks I D (D x) (xs @ [x']))"
     using R and B by (simp add: sinks_cons_same)
    hence "ipurge_tr I D (D x) (xs @ [x']) = ipurge_tr I D (D x) xs @ [x']"
     by simp
    ultimately show ?thesis by simp
  qed
qed

lemma sinks_cons_nonint:
  assumes A: "(u, D x) ∉ I"
  shows "sinks I D u (x # xs) = sinks I D u xs"
proof (rule rev_induct, simp)
  have "sinks I D u [x] = sinks I D u ([] @ [x])" by simp
  hence "sinks I D u [x] = (if (u, D x) ∈ I ∨ (∃v ∈ {}. (v, D x) ∈ I)
    then insert (D x) {}
    else {})"
   by (simp only: sinks.simps)
  thus "sinks I D u [x] = {}" using A by simp
next
  fix xs x'
  assume B: "sinks I D u (x # xs) = sinks I D u xs" (is "?d' = ?d")
  have "x # xs @ [x'] = (x # xs) @ [x']" by simp
  hence C: "sinks I D u (x # xs @ [x']) =
    (if (u, D x') ∈ I ∨ (∃v ∈ ?d'. (v, D x') ∈ I)
    then insert (D x') ?d'
    else ?d')"
    by (simp only: sinks.simps)
  show "sinks I D u (x # xs @ [x']) = sinks I D u (xs @ [x'])"
  proof (cases "(u, D x') ∈ I ∨ (∃v ∈ ?d. (v, D x') ∈ I)")
    case True
    with B and C have "sinks I D u (x # xs @ [x']) = insert (D x') ?d"
      by simp
    with True show ?thesis by simp
  next
    case False
    with B and C have "sinks I D u (x # xs @ [x']) = ?d" by simp
    with False show ?thesis by simp
  qed
qed

lemma sinks_empty [rule_format]:
 "sinks I D u xs = {} ⟶ ipurge_tr I D u xs = xs"
proof (rule rev_induct, simp, rule impI)
  fix x xs
  assume A: "sinks I D u (xs @ [x]) = {}"
  moreover have "sinks I D u xs ⊆ sinks I D u (xs @ [x])"
   by (simp add: subset_insertI)
  ultimately have "sinks I D u xs = {}" by simp
  moreover assume "sinks I D u xs = {} ⟶ ipurge_tr I D u xs = xs"
  ultimately have "ipurge_tr I D u xs = xs" by (rule rev_mp)
  thus "ipurge_tr I D u (xs @ [x]) = xs @ [x]" using A by simp
qed

lemma ipurge_ref_eq:
  assumes A: "D x ∈ sinks I D u (xs @ [x])"
  shows "ipurge_ref I D u (xs @ [x]) X =
    ipurge_ref I D u xs {x' ∈ X. (D x, D x') ∉ I}"
proof (rule equalityI, rule_tac [!] subsetI, simp_all add: ipurge_ref_def del: sinks.simps,
 (erule conjE)+, (erule_tac [2] conjE)+)
  fix y
  assume B: "∀v ∈ sinks I D u (xs @ [x]). (v, D y) ∉ I"
  show "(D x, D y) ∉ I ∧ (∀v ∈ sinks I D u xs. (v, D y) ∉ I)"
  proof (rule conjI, rule_tac [2] ballI)
    show "(D x, D y) ∉ I" using B and A ..
  next
    fix v
    assume "v ∈ sinks I D u xs"
    hence "v ∈ sinks I D u (xs @ [x])" by simp
    with B show "(v, D y) ∉ I" ..
  qed
next
  fix y
  assume
    B: "(D x, D y) ∉ I" and
    C: "∀v ∈ sinks I D u xs. (v, D y) ∉ I"
  show "∀v ∈ sinks I D u (xs @ [x]). (v, D y) ∉ I"
  proof (rule ballI, cases "(u, D x) ∈ I ∨ (∃v ∈ sinks I D u xs. (v, D x) ∈ I)")
    fix v
    case True
    moreover assume "v ∈ sinks I D u (xs @ [x])"
    ultimately have "v = D x ∨ v ∈ sinks I D u xs" by simp
    moreover {
      assume "v = D x"
      with B have "(v, D y) ∉ I" by simp
    }
    moreover {
      assume "v ∈ sinks I D u xs"
      with C have "(v, D y) ∉ I" ..
    }
    ultimately show "(v, D y) ∉ I" by blast
  next
    fix v
    case False
    moreover assume "v ∈ sinks I D u (xs @ [x])"
    ultimately have "v ∈ sinks I D u xs" by simp
    with C show "(v, D y) ∉ I" ..
  qed
qed

end