Notes on Leakage Resilient Secret Sharing 2

All the material below comes from [BDIR19] and [MPSW20].

First Attempt…

Fix an $(n,m)$ local leakage function $\vec{L}$ and let $\vec{\ell} \in \left(\set{0,1}^m\right)^n$ be a leakage value. Let $L_i^{-1} (\ell_i) \subseteq \mathbb{F}$ be the set of the $i$-th party's possible shares on which the leakage function $L_i$ outputs $\ell_i \in \set{0,1}^m$.

$s_i \in L_i^{-1}(\ell_i) \iff L_i(s_i) = \ell_i$.

Therefore, for a leakage function $\vec{L}$ we observe leakage $\vec{\ell}$ if and only if the share vector $\vec{s}$ belongs to the set
$$
\vec{L}^{-1}(\vec{\ell}) := L_1^{-1}(\ell_1) \times \cdots \times L_n^{-1}(\ell_n)
$$
Thus the probability of leakage being $\vec{\ell}$ conditioned on the secret being $s^{(0)}$ is
$$
\frac{1}{|\mathbb{F}|^k} \cdot \left| \left( s^{(0)} \cdot \vec{v} + \langle G \rangle \right) \cap \vec{L}^{-1}(\vec{\ell}) \right|.
$$
We can formulate the probability of the leakage being $\vec{\ell}$ conditioned on the secret being $s^{(1)}$ similarly, which gives the statistical distance
$$
\Delta(\vec{L}(s^{(0)}), \vec{L}(s^{(1)})) = \frac{1}{2} \cdot \frac{1}{|\mathbb{F}|^k} \sum_{\vec{\ell} \in \left(\set{0, 1}^m\right)^n} \Bigg| \left| \left( s^{(0)} \cdot \vec{v} + \langle G \rangle \right) \cap \vec{L}^{-1}(\vec{\ell}) \right| - \left| \left( s^{(1)} \cdot \vec{v} + \langle G \rangle \right) \cap \vec{L}^{-1}(\vec{\ell}) \right| \Bigg|.
$$
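As a sanity check on these two formulas, here is a minimal brute-force sketch in Python. Everything in it is an illustrative assumption: a toy Massey-style scheme over $\mathbb{F}_5$ with $n = 3$, $k = 1$, an arbitrary matrix $P$, and 1-bit leakage of the least significant bit of each share.

```python
# Brute-force Delta(L(s0), L(s1)) for a toy Massey-style scheme (illustrative only).
from itertools import product

p, n, k = 5, 3, 1                       # toy parameters, not from the papers
P = [[1, 2],                            # row of P used by the secret coordinate
     [3, 4]]                            # rows of P used by the randomness
v = [0] * k + P[0]                      # shares contributed by the secret
G = [[1 if i == j else 0 for j in range(k)] + P[i + 1] for i in range(k)]

def shares(secret, msg):
    """Share vector secret*v + msg*G, coordinate-wise mod p."""
    out = [secret * v[j] % p for j in range(n)]
    for i in range(k):
        for j in range(n):
            out[j] = (out[j] + msg[i] * G[i][j]) % p
    return tuple(out)

leak = lambda s: tuple(x % 2 for x in s)          # L_i = least significant bit

def leakage_dist(secret):
    dist = {}
    for msg in product(range(p), repeat=k):       # uniform randomness, 1/p^k each
        ell = leak(shares(secret, msg))
        dist[ell] = dist.get(ell, 0) + 1 / p**k
    return dist

d0, d1 = leakage_dist(0), leakage_dist(1)
print(0.5 * sum(abs(d0.get(e, 0) - d1.get(e, 0)) for e in set(d0) | set(d1)))
```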

(nope, it is hard to bound…)

A rough estimate: there are $\left(\left(2^m\right)^{|\mathbb{F}|}\right)^n$ possible leakage functions in total, and $|\mathbb{F}|^{(k + 1) \times (n-k)}$ generator matrices in standard form.

Suppose that for each $\vec{L}$ there is at most one $G$ such that $\Delta(\vec{L}(s^{(0)}), \vec{L}(s^{(1)})) > \varepsilon$. Then, by a union bound, there are at most $\left(\left(2^m\right)^{|\mathbb{F}|}\right)^n$ bad $G$ matrices.

We would need $\left(\left(2^m\right)^{|\mathbb{F}|}\right)^n \ll |\mathbb{F}|^{(k + 1) \times (n-k)}$, but if, say, $k = n(1 - o(1))$ and $m = 1$, this is impossible.
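A quick numeric illustration of why this fails; all parameters below are arbitrary choices, made just to compare the two exponents.

```python
# Compare log2(#leakage functions) with log2(#standard-form generator matrices)
# in the regime k = n(1 - o(1)), m = 1.
import math

lam = 128                       # security parameter (illustrative)
F_size = 2 ** (lam - 1)         # |F| ~ 2^(lambda - 1)
n, m = 64, 1
k = n - 4                       # "k = n(1 - o(1))" style choice

log2_leakage_fns = n * F_size * m                      # log2( ((2^m)^|F|)^n )
log2_generators = (k + 1) * (n - k) * math.log2(F_size)

print(f"log2(#leakage functions)  ~ {log2_leakage_fns:.3e}")
print(f"log2(#generator matrices) ~ {log2_generators:.3e}")
# The first number dwarfs the second, so the per-function union bound over G cannot work.
```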

New Set of Tests

We first recall binary entropy function $h_2: [0,1] \to [0,1]$ with
$$
h_2(p) = -p\log_2 p - (1-p)\log_2(1-p).
$$
Recall the previous notions: $G^+ \in \mathbb{F}^{(k + 1) \times (n + 1)}$ is of the form $[I_{k + 1} \mid P]$ where $P\in \mathbb{F}^{(k + 1)\times (n-k)}$. The shares of the secret $0$ form the $[n,k]_\mathbb{F}$ code $\langle G \rangle$, where $G \in \mathbb{F}^{k \times n}$ has the form $[I_k \mid R]$ and $R \in \mathbb{F}^{k \times (n-k)}$.

$H = [-R^\top | I_{n-k}]$ generates the dual code of $\langle G \rangle$.
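A minimal sketch of this standard-form relation, assuming nothing beyond the shapes above (the entries of $R$ are random placeholders; the only point is $G H^\top = 0$ over $\mathbb{F}_p$).

```python
# Build G = [I_k | R] and H = [-R^T | I_{n-k}] over F_p and check that they are dual.
import random

p, n, k = 11, 6, 3                         # illustrative parameters
R = [[random.randrange(p) for _ in range(n - k)] for _ in range(k)]

identity = lambda d: [[int(i == j) for j in range(d)] for i in range(d)]
G = [identity(k)[i] + R[i] for i in range(k)]                                  # k x n
H = [[(-R[i][j]) % p for i in range(k)] + identity(n - k)[j]
     for j in range(n - k)]                                                    # (n-k) x n

# Every row of G is orthogonal mod p to every row of H, i.e. G H^T = 0.
assert all(sum(a * b for a, b in zip(g, h)) % p == 0 for g in G for h in H)
print("G H^T = 0 (mod p): <H> generates the dual code of <G>")
```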

Fix $\sigma \in [0,1]$, $\gamma \in \mathbb{N}$, and $a \in \mathbb{N}$. The set of all tests $\mathsf{Test}_{\sigma,\gamma,a}$ is defined as follows:

  • every test is indexed by a pair $(\vec{V}, J)$, where $\vec{V} = (V_1,\ldots,V_n)$ and each $V_i$ is a size-$\gamma$ subset of $\mathbb{F}$;
  • $J$ is a size-$(1-\sigma) n$ subset of $\set{1,\ldots,n}$.

A codeword $c$ fails the test indexed by $(\vec{V}, J)$ if $\forall i \in J, c_i \in V_i$.

The generator matrix $H$ fails the test indexed by $(\vec{V}, J)$ if at least $a^n$ codewords of $\langle H \rangle$ fail the test. $H$ passes $\mathsf{Test}_{\sigma,\gamma,a}$ if it fails none of the tests in the set.
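A direct transcription of these definitions (the function names are mine, and the brute-force enumeration of $\langle H \rangle$ is only feasible for toy parameters):

```python
# "Fails the test" for a single codeword and for the whole code <H>.
from itertools import product

def codeword_fails(c, V, J):
    """c fails the test (V, J) if c_i lies in V_i for every i in J (0-based)."""
    return all(c[i] in V[i] for i in J)

def code_fails(H, p, V, J, a, n):
    """<H> fails (V, J) if at least a^n of its codewords fail it."""
    k_dual = len(H)
    bad = 0
    for msg in product(range(p), repeat=k_dual):          # enumerate all of <H>
        c = [sum(msg[i] * H[i][j] for i in range(k_dual)) % p for j in range(n)]
        bad += codeword_fails(c, V, J)
    return bad >= a ** n

def passes_tests(H, p, tests, a, n):
    """H passes Test_{sigma,gamma,a} if it fails none of the tests in the set."""
    return not any(code_fails(H, p, V, J, a, n) for (V, J) in tests)
```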

A rough sanity check: the total number of tests, i.e. the size of $\mathsf{Test}_{\sigma,\gamma,a}$, is
$$
\binom{|\mathbb{F}|}{\gamma}^n \cdot \binom{n}{(1-\sigma) n} \le |\mathbb{F}|^{\gamma n} \cdot 2^{h_2(\sigma) n}.
$$
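A tiny numeric check of this bound for arbitrary small parameters (only the inequality is being checked, not any asymptotics):

```python
# Check  C(|F|, gamma)^n * C(n, (1-sigma)n)  <=  |F|^(gamma n) * 2^(h2(sigma) n).
import math

def h2(x):
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

F_size, gamma, n, sigma = 101, 3, 20, 0.25          # illustrative parameters
lhs = math.comb(F_size, gamma) ** n * math.comb(n, round((1 - sigma) * n))
rhs = F_size ** (gamma * n) * 2 ** (h2(sigma) * n)
print(math.log2(lhs), "<=", math.log2(rhs), ":", lhs <= rhs)
```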

Lemma 2 (Technical Lemma 2)

Fix constants $\sigma,\gamma,a$, let $p \geq 2^{\lambda - 1}$ be a prime (the field is $\mathbb{F} = \mathbb{F}_p$), and require $\lim_{\lambda \to \infty} \frac{n}{\lambda} \in (0,1)$, where $\lambda$ is the security parameter.

Let $G^+$ be the generator matrix of an $[n+1,k+1]_\mathbb{F}$ code in standard form, where the constant $\frac{k}{n}\in (\sigma, 1)$.

Consider an MSS corresponding to $\langle G^+\rangle$. Let $\langle G \rangle$ be the $[n,k]_\mathbb{F}$ code and $\langle H\rangle$ be the $[n,k^\bot]_\mathbb{F}$ code that is the dual of $\langle G \rangle$, where $k^\bot = n - k$.

Then, over a uniformly random standard-form $G^+$, the following holds:
$$
\Pr_{G^+} \left[ H\ \text{is MDS} \wedge H\ \text{passes}\ \mathsf{Test}_{\sigma,\gamma,a} \right] \geq 1 - 2^{-(\lambda - n)} - \exp(- \Theta(\lambda^3)).
$$
For the probability of $H$ being MDS we can reuse the proof from the previous note, so from now on we simply assume that $\langle G \rangle$ and $\langle H \rangle$ are MDS.

WLOG we let $J = \set{\sigma n + 1,\sigma n + 2,\ldots,n}$, and we additionally set $J' = \set{k+1,\ldots,n}$ as the information set of $\langle H \rangle$. Fix a set of witnesses $B \subseteq V_{k+1} \times \cdots \times V_n \subseteq \mathbb{F}^{k^\bot}$ with size $a^n$.

The goal of this setup is to bound the probability that a codeword $c$ satisfies $c_J \in V_J$ conditioned on $c_{J'} \in B$, i.e.
$$
\Pr\left[c_J \in V_J \mid c_{J'} \in B \right],
$$
which is the probability that a codeword whose restriction to the information set $J'$ lies in the witness set $B$ fails the test $(\vec{V}, J)$.
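A Monte Carlo sketch of this conditional probability, taking $B$ to be all of $V_{k+1} \times \cdots \times V_n$ and a fresh random $H$ in every trial; the toy parameters, the 0-based indexing, and the decision to skip the zero witness are all my own choices.

```python
# Empirical Pr[ c_J in V_J | c_{J'} in B ] over a random H, with B = V_{k+1} x ... x V_n.
import random
from itertools import product

p, n, k, gamma, sigma = 7, 8, 4, 3, 0.25        # toy parameters
k_dual, sn = n - k, int(sigma * n)
J  = range(sn, n)                               # WLOG the last (1 - sigma)n coordinates
Jp = range(k, n)                                # information set of <H>
V  = [sorted(random.sample(range(p), gamma)) for _ in range(n)]

def random_H():
    """H = [-R^T | I_{n-k}] for a uniformly random R."""
    R = [[random.randrange(p) for _ in range(k_dual)] for _ in range(k)]
    return [[(-R[j][t]) % p for j in range(k)] +
            [int(s == t) for s in range(k_dual)] for t in range(k_dual)]

fails = total = 0
for _ in range(200):
    H = random_H()
    for d in product(*[V[i] for i in Jp]):      # witnesses; c_{J'} = d automatically
        if not any(d):
            continue                            # skip the zero witness
        c = [sum(d[t] * H[t][j] for t in range(k_dual)) % p for j in range(n)]
        fails += all(c[j] in V[j] for j in J)
        total += 1
print("empirical:", fails / total, " predicted:", (gamma / p) ** (k - sn))
```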

We start with a sanity check:

  • For $\vec{V}$, we have at most $\binom{|\mathbb{F}|}{\gamma}^n \le |\mathbb{F}|^{\gamma n}$ choices.
  • For $J$, we have at most $\binom{n}{(1-\sigma)n} \le 2^{h_2(\sigma)n}$ choices.
  • For the witness set $B$: each coordinate $i \in \set{k+1, \ldots,n}$ of a witness has $\gamma$ choices (it lies in $V_i$), so there are at most $\gamma^{k^\bot}$ possible witnesses and hence at most $\binom{\gamma^{k^\bot}}{a^n}$ choices of $B$ with $|B| = a^n$.

Thus the total number of choices of $(\vec{V},J)$ and $B$ is at most
$$
|\mathbb{F}|^{\gamma n} \cdot 2^{h_2(\sigma) \cdot n} \cdot \binom{\gamma^{k^\bot}}{a^n}.
$$
Fix a column index $j \in J \setminus J' = \set{\sigma n + 1,\ldots,k}$ and pick a non-zero witness $\vec{d^{(1)}} \in B$. Over the randomness of choosing $H_{\ast,j}$, the random variable $\vec{d^{(1)}} \cdot H_{\ast, j}$ is uniformly distributed over $\mathbb{F}$.

The probability that $\vec{d^{(1)}} \cdot H_{\ast,j}$ lies in $V_j$ is therefore $\frac{\gamma}{|\mathbb{F}|}$, for every $j \in J \setminus J'$. Since the columns $H_{\ast,j}$ are independent, the probability that the $j$-th coordinate of the codeword $\vec{d^{(1)}}\cdot H$ lies in $V_j$ for all $j \in J \setminus J'$ simultaneously is
$$
\left(\frac{\gamma}{|\mathbb{F}|}\right)^{k - \sigma n} = \left(\frac{\gamma}{|\mathbb{F}|}\right)^{(1-\sigma)n - k^\bot}.
$$
Consider a second witness $\vec{d^{(2)}} \in B$. If $\vec{d^{(2)}}$ is not in the span of $\vec{d^{(1)}}$, then the random variable $\vec{d^{(2)}} \cdot H_{\ast, j}$ is independent of $\vec{d^{(1)}} \cdot H_{\ast, j}$, and the probability that the $j$-th coordinate of $\vec{d^{(2)}}\cdot H$ lies in $V_j$ for all $j \in J \setminus J'$ is again
$$
\left(\frac{\gamma}{|\mathbb{F}|}\right)^{k - \sigma n} = \left(\frac{\gamma}{|\mathbb{F}|}\right)^{(1-\sigma)n - k^\bot}.
$$
Thus we have the claim: fix any $r$ linearly independent witnesses $\vec{d^{(1)}},\ldots, \vec{d^{(r)}} \in \mathbb{F}^{k^{\bot}}$; then, over the randomness of choosing $H_{\ast, j}$ for $j \in J \setminus J'$, the distribution of the random matrix
$$
\left( \vec{d^{(i)}}\cdot H_{\ast,j} \right)^{j \in J \setminus J'}_{i \in \set{1,\ldots,r}}
$$
is uniform over $\mathbb{F}^{r \times ((1-\sigma)n - k^\bot)}$.

Thus the probability that all of the codewords corresponding to these $r$ linearly independent witnesses in $B$ fail the test $(\vec{V}, J)$ is
$$
\left[ \left(\frac{\gamma}{|\mathbb{F}|}\right)^{k - \sigma n} \right]^r = \left[ \left(\frac{\gamma}{|\mathbb{F}|}\right)^{(1-\sigma)n - k^\bot}\right]^r.
$$
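A Monte Carlo check of this claim for $r = 2$ (the witnesses, field size, and other parameters below are arbitrary illustrative choices; only the $J \setminus J'$ coordinates are checked, since the $J'$ coordinates of a witness in $B$ lie in the corresponding $V_i$ by construction):

```python
# Joint failure probability for r = 2 linearly independent witnesses
# should be about ((gamma/|F|)^(k - sigma n))^r.
import random

p, n, k, gamma, sigma = 7, 8, 4, 3, 0.25        # toy parameters, 0-based indices
k_dual, sn = n - k, int(sigma * n)
JmJp = range(sn, k)                             # J \ J'
V = [sorted(random.sample(range(p), gamma)) for _ in range(n)]
D = [(1, 0, 2, 5), (0, 1, 3, 4)]                # two linearly independent witnesses

hits, trials = 0, 20000
for _ in range(trials):
    R = [[random.randrange(p) for _ in range(k_dual)] for _ in range(k)]
    # j-th coordinate of d * H is  d . H_{*,j} = -sum_t d_t R[j][t]  for j in J \ J'
    hits += all((-sum(d[t] * R[j][t] for t in range(k_dual))) % p in V[j]
                for d in D for j in JmJp)
print("empirical:", hits / trials,
      " predicted:", ((gamma / p) ** (k - sn)) ** len(D))
```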

So long as the witnesses $\vec{d^{(i)}}$ are linearly independent, each contributes an independent factor of $\left(\frac{\gamma}{|\mathbb{F}|}\right)^{k - \sigma n}$ to the probability of all of them falling in $\vec{V}$.

Therefore, if we know how many of the $\vec{d^{(i)}}$ in $B$ are linearly independent (recall $|B| = a^n$), we can bound the probability that every codeword arising from $B$ fails the test $(\vec{V}, J)$.

Let $M \in \mathbb{F}^{u \times v}$, where $u = 2^{\alpha v}$, be an arbitrary matrix whose rows are all distinct. Suppose every column $j \in \set{1,\ldots,v}$ of $M$ satisfies
$$
|\set{M_{1,j},\ldots,M_{u,j}}| \le \gamma;
$$
then $\operatorname{rank}(M) \ge \frac{\alpha}{\log_2 \gamma} \cdot v$.

Proof sketch: pick $\operatorname{rank}(M)$ columns that span the column space of $M$. Every other column is a linear combination of these, so each row of $M$ is determined by its entries on these $\operatorname{rank}(M)$ columns, and each such entry takes at most $\gamma$ values. Hence $M$ has at most $\gamma^{\operatorname{rank}(M)}$ distinct rows. Since all $u = 2^{\alpha v}$ rows are distinct, $2^{\alpha v} \le \gamma^{\operatorname{rank}(M)}$, i.e. $\operatorname{rank}(M) \ge \frac{\alpha v}{\log_2 \gamma}$.
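A small demonstration of the rank lemma (the field, the sizes, and the per-column alphabets are arbitrary choices; the rank is computed by naive Gaussian elimination over $\mathbb{F}_p$):

```python
# Rank lemma demo: u = 2^{alpha v} distinct rows, <= gamma values per column
#   ==>  rank >= alpha * v / log2(gamma).
import math, random

def rank_mod_p(M, p):
    """Row-reduce a copy of M over F_p and count the pivots."""
    M = [row[:] for row in M]
    rank = 0
    for c in range(len(M[0])):
        piv = next((r for r in range(rank, len(M)) if M[r][c]), None)
        if piv is None:
            continue
        M[rank], M[piv] = M[piv], M[rank]
        inv = pow(M[rank][c], -1, p)
        M[rank] = [x * inv % p for x in M[rank]]
        for r in range(len(M)):
            if r != rank and M[r][c]:
                M[r] = [(a - M[r][c] * b) % p for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

p, v, gamma = 101, 10, 4
alphabets = [random.sample(range(p), gamma) for _ in range(v)]   # <= gamma values per column
rows = set()
while len(rows) < 2 ** 6:                        # u = 2^{alpha v}, here alpha = 0.6
    rows.add(tuple(random.choice(col) for col in alphabets))
M = [list(r) for r in rows]
alpha = math.log2(len(M)) / v
print("rank:", rank_mod_p(M, p), " lower bound:", alpha * v / math.log2(gamma))
```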

We apply this to the matrix $M \in \mathbb{F}^{a^n \times k^\bot}$ whose rows are the (distinct) witnesses in $B$; each column takes values in some $V_i$ of size $\gamma$, and $a^n = 2^{n \log_2 a}$. Thus $r \ge \frac{\log_2 a}{\log_2 \gamma} \cdot n$, i.e. $r = \Theta(n)$.

Thus the probability that all of the codewords arising from the witnesses in $B$ fail the test $(\vec{V}, J)$ is at most $\left[ \left(\frac{\gamma}{|\mathbb{F}|}\right)^{k - \sigma n}\right]^r = |\mathbb{F}|^{-\Theta(n^2)}$.

By a union bound, the probability that a random $\langle H \rangle$ fails some test $(\vec{V}, J)$ is at most
$$
|\mathbb{F}|^{\gamma n - \Theta(n^2)} \cdot 2^{h_2(\sigma) \cdot n} \cdot \binom{\gamma^{k^\bot}}{a^n} \le |\mathbb{F}|^{\gamma n - \Theta(n^2)} \cdot 2^{h_2(\sigma) \cdot n} \cdot \gamma^{k^\bot \cdot a^n}.
$$
A problem remains: this bound is far too large (the factor $\gamma^{k^\bot \cdot a^n}$ is doubly exponential), so we need to tighten the count.

Recall that we choose $B$, of size $a^n$, from $V_{k + 1}\times \cdots\times V_{n}$. We are overcounting the witnesses: instead of enumerating all of $B$, it suffices to union-bound over the unique lexicographically smallest subset $\hat{B} \subseteq B$ of $r$ linearly independent witnesses.

Thus the union bound becomes
$$
\begin{aligned}
|\mathbb{F}|^{\gamma n} \cdot 2^{h_2(\sigma) \cdot n} \cdot \binom{\gamma^{k^\bot}}{r} \cdot \left[ \left( \frac{\gamma}{|\mathbb{F}|}\right)^{(1-\sigma) n - k^\bot}\right]^r
&\le |\mathbb{F}|^{\gamma n} \cdot 2^{h_2(\sigma) \cdot n} \cdot \gamma^{k^\bot \cdot r} \cdot \left[ \left( \frac{\gamma}{|\mathbb{F}|}\right)^{(1-\sigma) n - k^\bot}\right]^r\newline
&= |\mathbb{F}|^{\gamma n} \cdot 2^{h_2(\sigma) \cdot n} \cdot \left( \frac{\gamma^{(1-\sigma)n}}{|\mathbb{F}|^{(1-\sigma) n - k^\bot}}\right)^r
\end{aligned}
$$
Recall the parameter setting: $k / n \in (\sigma, 1)$ and $\lim_{\lambda \to \infty} n / \lambda \in (0,1)$, so $(1-\sigma)n - k^\bot = k - \sigma n = \Theta(n)$, $r = \Theta(n)$, and $|\mathbb{F}| \ge 2^{\lambda - 1}$. The dominant factor is $|\mathbb{F}|^{-((1-\sigma)n - k^\bot)\cdot r} = 2^{-\Theta(\lambda n^2)}$, so the probability of the second failure event ($\langle H \rangle$ failing some test in $\mathsf{Test}_{\sigma,\gamma,a}$) is at most $\exp(-\Omega(\lambda^3))$, matching the $\exp(-\Theta(\lambda^3))$ term in the lemma.
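Finally, a toy exponent calculation for the last bound, with illustrative parameters in the stated regime (constant $k/n \in (\sigma,1)$, $n = \Theta(\lambda)$, $r = \frac{\log_2 a}{\log_2 \gamma} n$); it just confirms that the exponent is hugely negative.

```python
# log2 of  |F|^{gamma n} * 2^{h2(sigma) n} * ( gamma^{(1-sigma)n} / |F|^{(1-sigma)n - k_dual} )^r
import math

def h2(x):
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

lam = 128
n = lam // 2                            # n / lambda constant in (0, 1)
sigma, gamma, a = 0.25, 4, 2
k = int(0.75 * n)                       # k / n in (sigma, 1)
k_dual = n - k
r = math.log2(a) / math.log2(gamma) * n
log2F = lam - 1                         # p >= 2^{lambda - 1}

log2_bound = (gamma * n * log2F + h2(sigma) * n
              + r * ((1 - sigma) * n * math.log2(gamma)
                     - ((1 - sigma) * n - k_dual) * log2F))
print(f"log2(union bound) ~ {log2_bound:.1f}")   # hugely negative for these parameters
```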