
Questions tagged [vc-dimension]

The VC dimension (for Vapnik–Chervonenkis dimension) is a measure of the capacity (complexity, expressive power, richness, or flexibility) of a statistical classification algorithm, defined as the cardinality of the largest set of points that the algorithm can shatter.
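To make the definition concrete, here is a small brute-force sketch (the helper names `is_shattered` and `thresholds_for` are my own, and the 1-D threshold class is just one illustrative choice): a set is shattered when the class realizes all $2^n$ labelings of it.

```python
def is_shattered(points, hypotheses):
    """Check whether `hypotheses` (a list of {0,1}-valued functions)
    realizes every labeling of `points`, i.e. shatters the set."""
    behaviors = {tuple(h(x) for x in points) for h in hypotheses}
    return len(behaviors) == 2 ** len(points)

# 1-D threshold classifiers h_t(x) = 1[x >= t]. On a finite point set,
# only thresholds below all points or between consecutive points matter.
def thresholds_for(points):
    cuts = sorted(points)
    candidates = [cuts[0] - 1] + [c + 0.5 for c in cuts]
    return [lambda x, t=t: int(x >= t) for t in candidates]

one = [0.0]
two = [0.0, 1.0]
print(is_shattered(one, thresholds_for(one)))  # True: a single point is shattered
print(is_shattered(two, thresholds_for(two)))  # False: the labeling (1, 0) is impossible
```

Since one point can be shattered but no two points can, the VC dimension of 1-D thresholds is 1.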

1 vote · 0 answers · 26 views

I have the following hypothesis class $\mathcal{H}_m$ parameterized by $m$ (the number of intervals): $$\mathcal{H}_m = \left\{\bigcup_{i=1}^{m} [b + i(i-1), b + i^2] : b \in \mathbb{R}\right\}$$ ...
asked by Maximiliard
5 votes · 1 answer · 165 views

I'm analyzing the function class: $$ \mathcal{F} = \left\{ (x, z, y) \mapsto \mathbb{1}\{ y \leq z\alpha + x^\top\beta \} : \alpha \in \mathbb{R}, \beta \in \mathbb{R}^d \right\}. $$ Let $\mathbb{G}_n(...
asked by Stan (724)
1 vote · 1 answer · 126 views

In Andrew Ng's machine learning notes (https://cs229.stanford.edu/main_notes.pdf), he introduces the following bound for the difference between generalization error and training error (see the ...
asked by ExcitedSnail (3,090)
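For a finite class of $k$ hypotheses, the bound in the CS229 notes is the usual Hoeffding-plus-union-bound result: with probability at least $1-\delta$, every hypothesis's training error is within $\sqrt{\frac{1}{2m}\log\frac{2k}{\delta}}$ of its generalization error. A sketch that computes the margin and sanity-checks it by Monte Carlo (the simulation setup, e.g. Bernoulli error rate $p=0.3$, is my own choice):

```python
import math
import random

def uniform_deviation_bound(m, k, delta):
    """Hoeffding + union bound over a finite class of k hypotheses and
    m i.i.d. samples: uniform deviation margin holding w.p. >= 1 - delta."""
    return math.sqrt(math.log(2 * k / delta) / (2 * m))

# Monte Carlo check: model each hypothesis's per-example errors as i.i.d.
# Bernoulli(p), and count how often the worst-case deviation of the
# empirical error from p exceeds the bound.
random.seed(0)
m, k, delta, p = 200, 50, 0.05, 0.3
gamma = uniform_deviation_bound(m, k, delta)
trials, violations = 200, 0
for _ in range(trials):
    worst = max(
        abs(sum(random.random() < p for _ in range(m)) / m - p)
        for _ in range(k)
    )
    violations += worst > gamma
print(gamma, violations / trials)  # empirical violation rate stays below delta
```

The union bound is loose, so in practice the violation rate is far below $\delta$.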
0 votes · 1 answer · 477 views

I have seen in many places that the VC dimension of $H$, a hypothesis class consisting of axis-parallel rectangles, is 4. Yet when constructing this 2D constellation of points I can't ...
asked by Tomer Gigi
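The standard shattering argument for axis-parallel rectangles can be checked by brute force. The key reduction (classical, though the code is my own sketch) is that a labeling is realizable iff the bounding box of the positive points contains no negative point:

```python
from itertools import combinations

def rectangles_shatter(points):
    """Check whether axis-parallel rectangles shatter a 2-D point set,
    using the bounding-box reduction over every nonempty subset."""
    for r in range(1, len(points) + 1):
        for pos in combinations(points, r):
            xs = [p[0] for p in pos]
            ys = [p[1] for p in pos]
            # A negative point inside the positives' bounding box
            # makes this labeling impossible.
            if any(q not in pos
                   and min(xs) <= q[0] <= max(xs)
                   and min(ys) <= q[1] <= max(ys)
                   for q in points):
                return False
    return True  # (the all-negative labeling is realized by a far-away box)

diamond = [(0, 1), (1, 0), (0, -1), (-1, 0)]
print(rectangles_shatter(diamond))             # True: the 4-point diamond is shattered
print(rectangles_shatter(diamond + [(0, 0)]))  # False: the center point is trapped
```

The diamond configuration witnesses VC dimension at least 4; the classical upper-bound argument (in any 5-point set, one point lies inside the bounding box of the other four) rules out 5.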
2 votes · 0 answers · 77 views

In the preface to the first edition of his book The Nature of Statistical Learning Theory, Vapnik makes the following comment: Between 1960 and 1980 a revolution in statistics occurred: Fisher's ...
asked by Alek Fröhlich
0 votes · 0 answers · 58 views

I'm studying the Vapnik–Chervonenkis dimension now, and there is one question that I am not able to solve. Let $(X, R)$ be a range space such that any hypergraph $(V, F)$ in it ...
asked by lsu (1)
1 vote · 0 answers · 61 views

If $m$ is the VC-dim then it means there is no configuration of $m+1$ datapoints we can shatter. But there could be configurations of $m$ datapoints we cannot shatter. Hence my confusion and my ...
asked by user3091275
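The definition is existential: VC dimension $m$ means *some* set of $m$ points is shattered, while other $m$-point configurations may fail. A sketch with axis-parallel rectangles (VC dimension 4), reusing the bounding-box test (helper name is my own):

```python
from itertools import combinations

def rectangles_shatter(points):
    # A labeling is realizable by an axis-parallel rectangle iff the
    # bounding box of the positive points contains no negative point.
    for r in range(1, len(points) + 1):
        for pos in combinations(points, r):
            xs = [p[0] for p in pos]
            ys = [p[1] for p in pos]
            if any(q not in pos
                   and min(xs) <= q[0] <= max(xs)
                   and min(ys) <= q[1] <= max(ys)
                   for q in points):
                return False
    return True

collinear = [(0, 0), (1, 0), (2, 0), (3, 0)]  # 4 collinear points: NOT shattered
diamond = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # 4 points in a diamond: shattered
print(rectangles_shatter(collinear), rectangles_shatter(diamond))  # False True
```

Both are 4-point sets, but only the diamond is shattered; the VC dimension is still 4 because one shattered configuration suffices.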
3 votes · 1 answer · 1k views

I know that the VC dimension for a perceptron is 3, but what is it for a logistic regression model?
asked by Dazckel (81)
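Thresholding logistic regression at probability 1/2 gives a linear decision boundary, so as a classifier it shatters exactly the same point sets as a perceptron with bias. A hedged numpy sketch (my own gradient-descent training loop and hyperparameters, not a library routine) showing it realizes all 8 labelings of 3 points in general position in $\mathbb{R}^2$:

```python
import numpy as np
from itertools import product

def fits_labeling(X, y, lr=0.5, steps=4000):
    """Train logistic regression by gradient descent and report whether
    the thresholded classifier labels the training set perfectly."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)      # gradient of logistic loss
    return bool(np.all((Xb @ w > 0) == (y == 1)))

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # 3 points in general position
achieved = sum(fits_labeling(X, np.array(y)) for y in product([0, 1], repeat=3))
print(achieved)  # 8: every labeling is realized, as for a perceptron
```

So the thresholded-classifier VC dimension matches the perceptron's ($d+1$ with bias, i.e. 3 in the plane); the logistic model's real-valued outputs are a separate question (pseudo-dimension).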
3 votes · 0 answers · 116 views

I started learning Advanced Machine Learning and came across a problem that I'm stuck on. I would be grateful for any ideas or solutions: What is the maximum value of the natural even ...
asked by Andrew (31)
1 vote · 0 answers · 116 views

In his Statistical Learning Theory (1998), Vapnik presents the following mixture of two normal laws (p.236), in which the parameters $a$ and $\sigma$ are unknown: $$p(z;a;\sigma)=\frac{1}{2}\mathcal{N}...
asked by demim00nde
2 votes · 0 answers · 146 views

I want to calculate an upper bound on how many training points an MLP regressor can fit with ~0 error. I don't care about the test error; I want to overfit as much as possible on the (few) training ...
asked by Daniele
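One simple *lower* bound on memorization capacity: freeze a random first layer and train only the output weights, which turns fitting into a least-squares problem. If the $n \times H$ hidden-feature matrix has rank $n$ (generic when the hidden width $H \ge n$), the network fits $n$ arbitrary targets exactly. A sketch with assumed sizes $n=20$, $H=64$ (my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, H = 20, 64  # n training points, H hidden ReLU units

# Inputs and arbitrary targets to memorize.
X = rng.normal(size=(n, 1))
y = rng.normal(size=n)

# Frozen random first layer; solving least squares for the output
# weights is equivalent to training only the last layer.
W = rng.normal(size=(1, H))
b = rng.normal(size=H)
Phi = np.maximum(X @ W + b, 0.0)  # hidden ReLU activations, shape (n, H)
out_w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
residual = np.max(np.abs(Phi @ out_w - y))
print(residual)  # ~0: all 20 points are fit exactly
```

Training all layers jointly can only do better, so an MLP with at least as many hidden units as training points can generically drive the training error to ~0.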
0 votes · 1 answer · 106 views

Let $X$ be some input domain (a measurable space). Then let $D$ be some class of probability distributions on $X\times\{0,1\}$. We will call such distributions learning tasks. We say that $D$ is ...
asked by Jack M (479)
1 vote · 0 answers · 114 views

I'm studying VC-dimension and sample complexity, and I'd like to check my understanding via the following example. Let $X = \mathbb{R}$ and $\mathcal{H} = \{ h_{\theta}(x)=\text{...
asked by Slim Shady
4 votes · 1 answer · 286 views

In more classical statistical methods like linear regression, we can quantify how well our model generalizes under certain strong assumptions. For example, we know that $\hat Y = X \hat \beta \sim \...
asked by Winger 14 (338)
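The classical result referenced here, $\hat\beta \sim \mathcal{N}(\beta, \sigma^2 (X^\top X)^{-1})$ under Gaussian noise with $X$ fixed, can be verified by simulation (a sketch; the design, coefficients, and trial count are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 50, 2, 1.5
X = np.hstack([np.ones((n, 1)), rng.normal(size=(n, p - 1))])  # fixed design
beta = np.array([1.0, -2.0])
target_cov = sigma ** 2 * np.linalg.inv(X.T @ X)  # claimed Cov(beta_hat)

# Monte Carlo over fresh noise draws with X held fixed.
trials = 20000
betas = np.empty((trials, p))
for t in range(trials):
    y = X @ beta + sigma * rng.normal(size=n)
    betas[t] = np.linalg.lstsq(X, y, rcond=None)[0]

emp_cov = np.cov(betas, rowvar=False)
print(np.max(np.abs(emp_cov - target_cov)))  # small: matches sigma^2 (X^T X)^{-1}
```

The empirical mean of $\hat\beta$ also matches $\beta$, confirming unbiasedness; VC-type bounds are the distribution-free counterpart when such parametric assumptions are dropped.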
1 vote · 0 answers · 149 views

I'm learning VC-dimensions and PAC-learnability right now and I need some help. I'm working through a practice exercise question while prepping for an exam. So suppose we have some domain $\mathcal{X} = \...
asked by M. Fire (11)
