Given $K$ points in an $N$-dimensional (hyper)cube with all edges of length 1, what is the expected minimal distance between two points? I found the one-dimensional case in this topic: Mean minimum distance for N random points on a one-dimensional line, and I wonder whether this can be generalized to multiple dimensions. I don't seem to succeed in extending the card analogy from that topic. Does anyone have any hints on how to proceed?

If you just count the average distance between two points in an $N$-cube (which is some kind of continuous version of the problem), this is known as box integrals. They have been studied intensively in recent years by D. Bailey, J. Borwein, R. Crandall, and their collaborators. The $2N$-dimensional integral is easy to write down but hard to evaluate by means of "known functions". The "treatable" cases (in a certain sense and as far as I know) end at dimension $N=5$. For $N=2$ the answer is compact enough, while higher dimensions involve arctangents, logs, and their repeated antiderivatives... – Wadim Zudilin, Apr 26 '10 at 13:34

If it isn't manageable to evaluate the integrals, is it possible to approximate them? I also don't really understand, assuming I know the average distance between two points (on that subject I found this interesting paper: jstor.org/stable/1427094?seq=1), how it is possible to extend this to the average minimal distance of $K$ points. – Ingdas, Apr 26 '10 at 14:13

You may be interested in some of J. Philip's papers at math.kth.se/~johanph. – villemoes, Jul 2 '10 at 6:59

The case $K=2$ is called hypercube line picking here: mathworld.wolfram.com/HypercubeLinePicking.html – Douglas Zare, Mar 17 '13 at 16:10
It depends on what kind of accuracy you are looking for, but you can get a crude bound of the order of $1/K^{1+1/N}$ by breaking the space into $K$ regions and applying a balls-and-bins argument.
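A quick way to get a feel for the balls-and-bins step is to simulate it: chop the cube into roughly $K$ congruent cells and check how often some cell receives two of the $K$ points; whenever that happens, the minimum pairwise distance is at most one cell's diagonal. A minimal Python sketch along these lines (the function name and parameters are my own, not part of the answer above):

```python
import numpy as np

def collision_rate(K, N, trials=2000, rng=np.random.default_rng(0)):
    """Fraction of trials in which some cell of a grid with roughly K cells
    receives two or more of the K uniform points in [0, 1]^N."""
    m = int(np.ceil(K ** (1.0 / N)))                     # m cells per axis, so m**N >= K cells
    hits = 0
    for _ in range(trials):
        pts = rng.random((K, N))
        cells = np.floor(pts * m).astype(int)            # per-axis cell index, 0..m-1
        flat = np.ravel_multi_index(cells.T, (m,) * N)   # single cell id per point
        if len(np.unique(flat)) < K:                     # some cell holds >= 2 points
            hits += 1
    return hits / trials

# Whenever a collision occurs, the minimum pairwise distance is at most
# sqrt(N)/m, the diagonal of a single cell.
print(collision_rate(K=50, N=2), collision_rate(K=50, N=3))
```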
The following paper:
Bhattacharyya, P., and B. K. Chakrabarti, "The mean distance to the nth neighbour in a uniform distribution of random points: an application of probability theory," Eur. J. Phys. 29 (2008), pp. 639–645.
claims to provide exact, approximate, and hand-waving estimates for the mean $k$-th nearest-neighbor distance in a uniform distribution of points over a $D$-dimensional Euclidean space (or a $D$-dimensional hypersphere of unit volume) when one ignores certain boundary effects.
However, Wadim's comment makes me somewhat concerned that the exact problem is much more complex. Please see the paper for the full derivation (and the approximate methods), but I'll write down the exact expression they converge on via two different methods, absolute probability and conditional probability.
Let $D$ be the dimension of the Euclidean space, let $N$ be the number of points randomly and uniformly distributed over the space, and let $MeanDist(D, N, k)$ be the mean distance from a given point to its $k$-th nearest neighbor. This yields:
$MeanDist(D, N, k) = \frac{(\Gamma(\frac{D}{2}+1))^{\frac{1}{D}}}{\pi^{\frac{1}{2}}} \frac{(\Gamma(k + \frac{1}{D}))}{\Gamma(k)} \frac{\Gamma(N)}{\Gamma(N + \frac{1}{D})}$
where $\Gamma(\cdot)$ is the complete gamma function.
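For what it's worth, the expression is easy to evaluate and to sanity-check numerically. The sketch below is my own (not from the paper): it evaluates the formula with log-gammas and compares it against a Monte Carlo estimate computed on the unit torus, which I'm using as a stand-in for the "boundary effects ignored" assumption:

```python
import math
import numpy as np

def mean_dist(D, N, k):
    """The formula above: mean distance to the k-th nearest neighbor among
    N uniform points in D dimensions (unit volume, boundaries ignored).
    Log-gammas keep the Gamma(N) / Gamma(N + 1/D) ratio finite for large N."""
    log_val = (math.lgamma(D / 2 + 1) / D
               - 0.5 * math.log(math.pi)
               + math.lgamma(k + 1 / D) - math.lgamma(k)
               + math.lgamma(N) - math.lgamma(N + 1 / D))
    return math.exp(log_val)

def mc_mean_dist(D, N, k, trials=200, rng=np.random.default_rng(1)):
    """Monte Carlo estimate of the same quantity on the unit torus, so that
    boundary effects are absent, as the formula assumes."""
    acc = 0.0
    for _ in range(trials):
        pts = rng.random((N, D))
        diff = np.abs(pts[:, None, :] - pts[None, :, :])
        diff = np.minimum(diff, 1.0 - diff)        # wrap-around distance per axis
        dist = np.sqrt((diff ** 2).sum(axis=-1))
        np.fill_diagonal(dist, np.inf)             # exclude self-distances
        kth = np.sort(dist, axis=1)[:, k - 1]      # k-th nearest neighbor per point
        acc += kth.mean()
    return acc / trials

# Both numbers should come out around 0.05 for D = 2, N = 100, k = 1.
print(mean_dist(2, 100, 1), mc_mean_dist(2, 100, 1))
```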
Wadim, might it be possible for you to provide some feedback on the derivation here vs. the method of box integrals you described in your comment?
The order is $\Theta(1/K^{2/N})$.
P.S. You can derive an upper bound and a lower bound by applying a covering argument and a packing argument, respectively. Then, with some delicate computation and approximation, you will get the result.
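In case it is useful, here is a quick (entirely non-rigorous) numerical check of that rate: estimate the expected minimum pairwise distance for a few values of $K$ and see whether $K^{2/N}$ times the estimate levels off to a constant. All names in this sketch are mine:

```python
import numpy as np
from scipy.spatial.distance import pdist

def expected_min_dist(K, N, trials=300, rng=np.random.default_rng(2)):
    """Monte Carlo estimate of E[min pairwise distance] for K uniform
    points in the unit N-cube."""
    return float(np.mean([pdist(rng.random((K, N))).min() for _ in range(trials)]))

N = 3
for K in (50, 100, 200, 400):
    est = expected_min_dist(K, N)
    # If the Theta(1/K^(2/N)) rate is right, this product should be roughly constant.
    print(K, est, est * K ** (2 / N))
```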


Sure. I will try to add a sketch of the proof after Christmas. – Dec 22 '19 at 2:19