Does pointwise convergence of an estimator imply consistency?












Let $n \in \mathbb{N}$ and $\Omega=\mathbb{N}^{n}$, $\mathcal{F}=2^{\Omega}$, $\mathcal{P}:=\{P_{\vartheta}:=\operatorname{Geom}(\vartheta)^{\otimes n} : 0<\vartheta<1\}$.



Find the estimator $\hat{\vartheta}:\Omega\to (0,\infty)$ such that $\forall \omega \in \Omega: P_{\hat{\vartheta}(\omega)}(\{\omega\})=\max_{\vartheta}P_{\vartheta}(\{\omega\})$,
using the function $f: \vartheta \mapsto \log(P_{\vartheta}(\{\omega\}))$.



Then show that the estimator is consistent.



My idea:



$\log(P_{\vartheta}(\{\omega\}))=\log\left(\prod_{i=1}^{n}(1-\vartheta)^{\omega_{i}-1}\vartheta\right)$



Then define $S:=\sum_{i=1}^{n}\omega_{i}$ and see that



$\log\left(\prod_{i=1}^{n}(1-\vartheta)^{\omega_{i}-1}\vartheta\right)=\log\left((1-\vartheta)^{S-n}\vartheta^{n}\right)=(S-n)\log(1-\vartheta)+n\log(\vartheta)$



It follows that $f'(\vartheta)=\frac{n}{\vartheta}-\frac{S-n}{1-\vartheta}$ and $f'(\vartheta)=0 \iff \vartheta = \frac{n}{S}$, and since $f''(\vartheta)<0$ the function is maximized at $\vartheta = \frac{n}{S}$.



So our estimator is $\hat{\vartheta}=\frac{n}{S}$.
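
A quick numerical sanity check of this formula is easy to run. The following is a minimal simulation sketch; the true parameter $0.3$, the seed, and the sample sizes are arbitrary choices for illustration, not part of the exercise:

```python
import numpy as np

# Draw i.i.d. Geom(theta) samples; numpy's geometric is supported on
# {1, 2, ...}, matching P(omega_i = k) = (1 - theta)^(k - 1) * theta.
# Then evaluate the MLE theta_hat = n / S derived above.
rng = np.random.default_rng(seed=0)
theta = 0.3  # arbitrary "true" parameter, 0 < theta < 1

for n in (10, 100, 10_000, 1_000_000):
    omega = rng.geometric(theta, size=n)  # omega_1, ..., omega_n
    S = omega.sum()                       # S = sum of the omega_i
    print(n, n / S)                       # theta_hat = n / S
```

As $n$ grows, the printed estimates settle near $0.3$; consistency is the formal version of exactly this behaviour.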



Now onto my actual problem: showing that an estimator is consistent. My understanding of a consistent estimator $\hat{\vartheta}$ of $\vartheta$ is that



for any $P_{\vartheta} \in \mathcal{P}$, $\hat{\vartheta}\xrightarrow{n \to \infty}\vartheta(P_{\vartheta})$.



But how can I test whether $\hat{\vartheta}$ converges to a parameter if I do not know what parameter $\vartheta$ is supposed to be?
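
A sketch of the standard resolution, assuming (as the comments below suggest) that "consistent" means convergence in probability: fix an arbitrary $0<\vartheta<1$ and argue under $P_{\vartheta}$. Writing $S_{n}$ for the sum of the first $n$ observations,

$$\frac{S_{n}}{n}=\frac{1}{n}\sum_{i=1}^{n}\omega_{i} \xrightarrow{P_{\vartheta}} \mathbb{E}_{\vartheta}[\omega_{1}]=\frac{1}{\vartheta} \quad \text{(WLLN for i.i.d. sequences),}$$

$$\hat{\vartheta}_{n}=\frac{n}{S_{n}}=\left(\frac{S_{n}}{n}\right)^{-1} \xrightarrow{P_{\vartheta}} \vartheta \quad \text{(continuous mapping: } x\mapsto 1/x \text{ is continuous at } 1/\vartheta>0\text{).}$$

Since $\vartheta$ was arbitrary, this gives consistency for every member of $\mathcal{P}$.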



Additional questions:



$1.$ Does my definition of a consistent estimator, $\hat{\vartheta}\xrightarrow{n \to \infty}\vartheta(P)$, mean that $\hat{\vartheta}$ converges to $\vartheta(P)$ pointwise and thereby almost everywhere?



$2.$ Since I am supposed to choose any $P_{\vartheta}\in \mathcal{P}$, my probability measure already depends on my choice of parameter $\vartheta$; so I cannot choose just any $P \in \mathcal{P}$, can I?



$3.$ Does pointwise convergence of an estimator imply that the estimator is consistent?










probability probability-theory statistics random-variables estimation






asked Jan 25 at 15:54 by MinaThuma












  • Typically, consistency means convergence in probability (so you may use the WLLN for i.i.d. sequences). – d.k.o., Jan 25 at 21:14










  • So if I can show that $\forall \omega \in \Omega,\ \hat{\vartheta}_{n}(\omega)\xrightarrow{n \to \infty} \vartheta(P)$ for any $P \in \mathcal{P}$, then this would be almost sure convergence and thereby convergence in probability? – MinaThuma, Jan 25 at 23:47












  • How are you going to show pointwise convergence? – d.k.o., Jan 25 at 23:55

















