# Matrix completion prize results

Earlier this year ARC posted a prize for two matrix completion problems. We received a number of submissions we considered useful, but no complete solutions. We are closing the contest and awarding the following partial prizes:

- $500 to Elad Hazan for solving a related problem and pointing us to __this paper__.
- $500 to Som Bagchi and Jacob Stavrianos for their analysis in __this comment__.
- $500 to Shalev Ben-David for a reduction to computing the __gamma 2 norm__.

Our main update from running this prize is that these problems are hard and there’s probably not a simple solution we are overlooking. My current guess is that it’s possible to achieve a polynomial dependence on the precision \(\varepsilon\), but not the logarithmic dependence we desired; even this weaker result seems like it will be challenging.

Thanks to everyone who took time to think about this problem.

### What this means for ARC

In this section I’ll try to briefly describe the relationship between these problems and heuristic estimators. I’ll use the context and notation from __this talk__. I don’t expect this discussion to be detailed enough to be meaningful to anyone who doesn’t already have a lot of context on ARC’s work, and I think most readers should wait to engage until we publish a more extensive research update next year.

One of ARC’s main activities this year has been refining our goals for heuristic estimators by finding algorithms, finding evidence for hardness, and clarifying what properties are actually needed for our desired alignment applications. This contest was part of that process.

In early 2023 ARC hoped to find an estimator \(\mathbb{G}\) such that for any matrix \(A\) and any argument \(\pi\), the heuristic estimate \(\mathbb{G}\!\left( v^T A A^T v \mid \pi \right)\) would be a non-negative quadratic function of \(v\). The two problems we proposed are very closely related to achieving this goal in the special case where \(\pi\) computes a sparse set of \(m\) entries of \(A A^T\). We now expect that it will be algorithmically difficult to ensure that \(\mathbb{G}\!\left( v^T A A^T v \mid \pi \right)\) is a non-negative quadratic; as a result, we don't expect this property to be satisfied by the kind of natural heuristic estimator we're looking for.
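To make the property concrete: since a quadratic form \(v \mapsto v^T B v\) is non-negative for all \(v\) exactly when \(B\) is positive semidefinite, the hoped-for guarantee corresponds to the estimator's implicit completion of the observed entries being PSD. A toy numpy sketch (the zero-fill completion and all names here are illustrative assumptions, not ARC's construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a hidden matrix A, so M = A @ A.T is PSD.
n, k = 4, 2
A = rng.standard_normal((n, k))
M = A @ A.T

# The argument pi computes a sparse set of m entries of A A^T.
observed = [(0, 0), (0, 1), (1, 0), (2, 2)]

def is_psd(B, tol=1e-9):
    """v^T B v >= 0 for all v  <=>  all eigenvalues of the
    symmetric part of B are >= 0 (up to numerical tolerance)."""
    return bool(np.all(np.linalg.eigvalsh((B + B.T) / 2) >= -tol))

# One naive completion: keep the observed entries, fill the rest
# with zeros. It matches pi's entries, but nothing forces it to be
# PSD -- ensuring that is exactly the hard part.
B = np.zeros((n, n))
for i, j in observed:
    B[i, j] = M[i, j]

print(is_psd(M))  # True: the true Gram matrix is PSD
print(is_psd(B))  # False here: an observed off-diagonal entry with
                  # a zero-filled diagonal gives a negative eigenvalue
```

The naive completion fails because the 2×2 principal submatrix \(\begin{pmatrix} M_{00} & M_{01} \\ M_{01} & 0 \end{pmatrix}\) has negative determinant whenever \(M_{01} \neq 0\); a useful estimator would need a completion rule that avoids such witnesses.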

We made a related update based on another result: Eric Neyman proved that unless \(\text{P}=\text{PP}\), there is no fast estimator \(\mathbb{G}\) that satisfies our other desiderata together with the property \(\mathbb{G}\!\left(f(x) \mid \pi\right) \geq 0\) whenever \(\pi\) proves that \(f(x) \geq 0\) for all \(x\). Instead, the best we can hope for is that \(\mathbb{G}\!\left(f(x) \mid \pi(x)\right) \geq 0\) whenever \(\pi(x)\) is a proof that \(f(x) \geq 0\) for a *particular value* of \(x\).

We now expect to make a similar relaxation for these matrix completion problems. Rather than requiring that \(\mathbb{G}\!\left( v^T A A^T v \mid \pi \right)\) is non-negative for *all* vectors \(v\), we can instead require that \(\mathbb{G}\!\left( v^T A A^T v \mid \pi, \pi(v) \right)\) is non-negative whenever \(\pi(v)\) proves that \(v^T A A^T v \geq 0\) for the particular vector \(v\). We don't expect \(\mathbb{G}\!\left( v^T A A^T v \mid \pi, \pi(v) \right)\) to be a quadratic function of \(v\) because of the appearance of \(\pi(v)\) on the right-hand side.

We still expect \(\mathbb{G}\!\left( v^T A A^T v \mid \pi \right)\) to be a quadratic function in \(v\) (this follows from linearity) and therefore to correspond to some completion \(B\) of \(AA^T\). However, we no longer expect \(B\) to be PSD. Instead, all we can say is that we don't yet **know** any direction \(v\) such that \(v^T B v < 0\). The completion \(B\) will change each time we consider a particular direction \(v\), after which it will be guaranteed that \(v^T B v \geq 0\).
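The shape of this weaker guarantee can be illustrated with a toy update rule (the rank-one correction below is a hypothetical sketch, not ARC's estimator, and it ignores the requirement that \(B\) remain consistent with the entries computed by \(\pi\)):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Start from some symmetric completion B, not necessarily PSD.
B = rng.standard_normal((n, n))
B = (B + B.T) / 2

def query(B, v):
    """Return an updated completion B' with v^T B' v >= 0.

    Toy rule: if the queried direction v witnesses v^T B v < 0,
    shift B along the rank-one matrix v v^T just enough to zero
    out the quadratic form in that direction."""
    q = v @ B @ v
    if q < 0:
        B = B + (-q / (v @ v) ** 2) * np.outer(v, v)
    return B

v = rng.standard_normal(n)
B = query(B, v)
print(v @ B @ v >= -1e-9)  # True: v^T B v >= 0 after the query
```

Note that after the update some *other* direction may still witness \(v'^T B v' < 0\); the guarantee only covers directions that have actually been queried, which is the sense in which \(B\) is "PSD as far as we know."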