The vanishing of the sum of coefficients: symmetric polynomials
Denote $\pmb{X}_n=(x_1,x_2,\dots,x_n)$. Consider the symmetric polynomial
$$f_n(\pmb X_n)=\prod_{1\leq i<j\leq n}(x_i+x_j).$$
Expand these in terms of elementary symmetric polynomials, say
$$f_n(\pmb{X}_n)=\sum_{\mu}c_{\mu,n}\cdot e_{\mu}(\pmb{X}_n).$$
For example,
\begin{align*} f_3&=-e_{(3)}+e_{(2,1)}, \\
f_5&=-e_{(5,5)}+2e_{(5,4,1)}+e_{(5,3,2)}-e_{(5,2,2,1)}-e_{(4,4,1,1)}-e_{(4,3,3)}+e_{(4,3,2,1)}.
\end{align*}
QUESTION 1. Is it true that, for integers $n\geq 1$, we have
$$\sum_{\mu}c_{\mu,2n+1}=0?$$
POSTSCRIPT. Fedor's reply to Question 1, shown below, suggests a second question:
QUESTION 2. Is it true that, for integers $n\geq 1$, we have
$$\sum_{\mu}c_{\mu,2n}=(-1)^{\binom{n}{2}}?$$
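For small $n$ the coefficient sums are easy to check by machine. Here is a minimal SymPy sketch (it relies on `symmetrize` from `sympy.polys.polyfuncs`, which rewrites a symmetric polynomial in the elementary symmetric polynomials; the helper name `coefficient_sum` is just for illustration). Summing the coefficients $c_{\mu,n}$ amounts to setting every $e_i=1$ in the expansion.

```python
# Expand f_n in elementary symmetric polynomials and sum the coefficients
# c_{mu,n} by substituting 1 for every elementary symmetric polynomial.
from sympy import Mul, binomial, expand, symbols
from sympy.polys.polyfuncs import symmetrize

def coefficient_sum(n):
    x = symbols(f"x1:{n + 1}")                        # x1, ..., xn
    f_n = expand(Mul(*[x[i] + x[j] for i in range(n) for j in range(i + 1, n)]))
    # symmetrize writes f_n as a polynomial in s1, ..., sn (the e_i's);
    # the remainder is 0 because f_n is symmetric.
    poly, remainder, defs = symmetrize(f_n, *x, formal=True)
    assert remainder == 0
    return poly.subs({s: 1 for s, _ in defs})

for n in range(2, 7):
    conjectured = 0 if n % 2 else (-1) ** binomial(n // 2, 2)
    print(n, coefficient_sum(n), conjectured)         # computed sum vs. conjectured value
```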
Tags: reference-request, co.combinatorics, rt.representation-theory, symmetric-functions
asked by T. Amdeberhan
1 Answer
answered by Fedor Petrov
Choose $n$ numbers $x_1,\dots,x_n$ for which all elementary symmetric polynomials are equal to $1$, and substitute them into our $f_n$: the sum of the coefficients $c_{\mu,n}$ is exactly the value of $f_n$ at such a point, so for odd $n$ we should get zero. What are these numbers? They are the roots of $x^{n}-x^{n-1}+x^{n-2}-\ldots-1=(x^{n+1}-1)/(x+1)$, and for odd $n$ this polynomial indeed has two roots whose sum equals $0$, so the product $f_n$ vanishes at them.
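Concretely, for odd $n$ the substitution point consists of the $(n+1)$-st roots of unity other than $-1$, and $f_n$ does vanish there; a quick plain-Python sketch (the function name `f_at_unit_esp` is just illustrative):

```python
# Evaluate f_n = prod_{i<j}(x_i + x_j) at the roots of (x^{n+1}-1)/(x+1),
# i.e. at the (n+1)-st roots of unity other than -1; all e_i equal 1 there.
import cmath
from itertools import combinations
from math import prod

def f_at_unit_esp(n):
    roots = [cmath.exp(2j * cmath.pi * t / (n + 1)) for t in range(n + 1)]
    roots = [w for w in roots if abs(w + 1) > 1e-9]   # -1 is a root iff n is odd
    return prod(a + b for a, b in combinations(roots, 2))

for n in (3, 5, 7, 9):
    print(n, abs(f_at_unit_esp(n)))                   # ~0 up to rounding error
```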
If $n=2k$ is even, we substitute the roots $w_1,\dots,w_n$ of the polynomial $f(x)=x^{2k}-x^{2k-1}+\ldots+1=(x^{2k+1}+1)/(x+1)=(x-w_1)\dots(x-w_n)$. Then your claim reads as
$$A:=\prod_{1\leqslant i<j\leqslant n} (w_i+w_j)=(-1)^{\binom{k}{2}}.$$
This is done by a standard trick (and is well known itself). First,
$$
|A|^2=\prod_{i=1}^n \prod_{\substack{1\leqslant j\leqslant n\\ j\ne i}}|w_i+w_j|
=2^{-n}\prod_{i=1}^n \prod_{j=1}^n|(-w_i)-w_j|
=2^{-n}\prod_{i=1}^n |f(-w_i)|
=2^{-n}\prod_{i=1}^n\left|\frac{(-w_i)^{2k+1}+1}{-w_i+1}\right|=1,
$$
since $|w_i|=1$ (so each diagonal term $|(-w_i)-w_i|$ equals $2$, accounting for the factor $2^{-n}$), $1+(-w_i)^{2k+1}=2$ for all $i=1,2,\dots,n$, and $\prod_{i=1}^n (1-w_i)=f(1)=1$.
Second, we need to find the argument of the complex number $A$. This may be done, for example, as follows: the sums $w_i+w_j$ for which $w_i$ and $w_j$ are not complex conjugate are partitioned into complex conjugate pairs, and in each such pair the product is a positive real. If $w_i$ and $w_j$ are complex conjugate, the sum $w_i+w_j$ is a real number whose sign is the sign of the real part of $w_i$. Therefore $A$ is a real number whose sign equals $(-1)^{m/2}$, where $m$ is the number of $w$'s in the left half-plane. It is easy to see that $m/2=[k/2]$ and that $(-1)^{[k/2]}=(-1)^{k(k-1)/2}$.
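The even case is just as easy to confirm numerically; a short plain-Python sketch (the function name `A` mirrors the notation above) computes $A$ from the roots of $(x^{2k+1}+1)/(x+1)$ and compares it with $(-1)^{\binom{k}{2}}$:

```python
# Compute A = prod_{i<j}(w_i + w_j), where w_1, ..., w_{2k} are the
# (2k+1)-st roots of -1 other than -1 itself, and compare with (-1)^{C(k,2)}.
import cmath
from itertools import combinations
from math import comb, prod

def A(k):
    m = 2 * k + 1
    w = [cmath.exp(1j * cmath.pi * (2 * t + 1) / m) for t in range(m) if t != k]  # t = k gives -1
    return prod(a + b for a, b in combinations(w, 2))

for k in range(1, 7):
    a = A(k)
    print(k, round(a.real, 6), round(a.imag, 6), (-1) ** comb(k, 2))
```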
An alternative to your "standard trick" is to observe that $w_i + w_j = \dfrac{w_i^2 - w_j^2}{w_i - w_j}$. This yields $\prod\limits_{i<j} \left(w_i+w_j\right) = \dfrac{\prod\limits_{i<j}\left(w_i^2 - w_j^2\right)}{\prod\limits_{i<j}\left(w_i-w_j\right)}$. But the $n$ numbers $-w_1^2, -w_2^2, \ldots, -w_n^2$ are just a permutation of the $n$ numbers $w_1, w_2, \ldots, w_n$, and thus $\dfrac{\prod\limits_{i<j}\left(w_i^2 - w_j^2\right)}{\prod\limits_{i<j}\left(w_i-w_j\right)}$ equals a power of $-1$ times the sign of this permutation. Both are easy to compute.
– darij grinberg
Yes, this is another standard trick :) Actually, possibly the shortest proof is to combine them: the absolute value equals $1$ since the differences $w_i^2-w_j^2$ and $w_i-w_j$ are the same up to sign, and the sign may be obtained by looking at the argument.
– Fedor Petrov
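Darij Grinberg's permutation observation can also be checked directly: if $\sigma$ is the permutation with $-w_i^2=w_{\sigma(i)}$, then $w_i^2-w_j^2=-(w_{\sigma(i)}-w_{\sigma(j)})$, so $A=(-1)^{\binom{n}{2}}\operatorname{sign}(\sigma)$. A small plain-Python sketch (the helpers `perm_sign` and `check` are just for illustration):

```python
# With sigma defined by -w_i^2 = w_{sigma(i)}, compare A with
# (-1)^{C(n,2)} * sign(sigma), where n = 2k.
import cmath
from itertools import combinations
from math import comb, prod

def perm_sign(perm):
    # sign of a permutation via its cycle decomposition
    seen = [False] * len(perm)
    sign = 1
    for start in range(len(perm)):
        if seen[start]:
            continue
        length, j = 0, start
        while not seen[j]:
            seen[j] = True
            j = perm[j]
            length += 1
        sign *= (-1) ** (length - 1)
    return sign

def check(k):
    m = 2 * k + 1
    w = [cmath.exp(1j * cmath.pi * (2 * t + 1) / m) for t in range(m) if t != k]  # drop -1
    n = len(w)                                                                    # n = 2k
    # match each -w_i^2 with the nearest root to recover sigma numerically
    sigma = [min(range(n), key=lambda j: abs(-w[i] ** 2 - w[j])) for i in range(n)]
    a = prod(x + y for x, y in combinations(w, 2))
    return round(a.real), (-1) ** comb(n, 2) * perm_sign(sigma)

for k in range(1, 7):
    print(k, *check(k))   # the two printed values agree
```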