# significance test

I've evaluated articles by counting the distribution of languages in the write-ups.

The results look like this:

Day 1               Day 2              Day 3

Economy             Economy            Economy
language 1: 0.35    language 1: 0.30   language 1: 0.90
language 2: 0.11    language 2: 0.10   language 2: 0.00
language 3: 0.54    language 3: 0.60   language 3: 0.10

Sports              Sports             Sports
language 1: 0.40    language 1: 0.30   language 1: 1.00
language 2: 0.20    language 2: 0.20   language 2: 0.00
language 3: 0.40    language 3: 0.50   language 3: 0.00


I've already posted another question on that topic (), but here is my second problem. First, I want to remove all statistical outliers from the data (for example, day 3) to make it "clean" (see my other question ()). After that, I want to determine which changes in my data are just "noise" and which are significant. But I'm not sure how to do it.
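The outlier-removal step could be sketched, for example, with Tukey's IQR fences. This is just one common rule, not necessarily what the linked question used, and the data below are made up to mimic the tables above:

```python
import statistics

# Hypothetical data: the daily share of language 1 in "Economy".
# The 0.90 value plays the role of the suspicious day 3.
data = [0.35, 0.30, 0.90, 0.33, 0.36, 0.31, 0.34, 0.32]

def remove_outliers(values, k=1.5):
    """Keep only values inside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q = statistics.quantiles(values, n=4)  # q[0] = Q1, q[2] = Q3
    q1, q3 = q[0], q[2]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if lo <= v <= hi]

cleaned = remove_outliers(data)  # drops the 0.90 outlier
```

With only three days per category the quartiles are meaningless, but over the full ~150-day history of each (category, language) series this kind of rule becomes usable.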

I was considering the following approach:

I could compute the standard deviation (like in my ) and treat every value beyond it as a "significant change". But I think this will go wrong if all my values are slowly increasing or decreasing.
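One way around the drift problem, sketched below as an assumption rather than a known answer, is to apply the standard-deviation threshold to day-to-day *differences* instead of raw values: a slow, steady trend then produces small, uniform differences and is not flagged, while a sudden jump stands out.

```python
import statistics

def flag_significant(values, k=1.0):
    """Return indices (into values) of days whose change from the
    previous day lies more than k standard deviations from the mean
    day-to-day change. k=1.0 mirrors the "beyond the standard
    deviation" rule from the question."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    mu = statistics.mean(diffs)
    sigma = statistics.stdev(diffs)
    return [i + 1 for i, d in enumerate(diffs) if abs(d - mu) > k * sigma]

# A gently rising series with one sudden spike on day index 4:
series = [0.30, 0.31, 0.32, 0.33, 0.90, 0.34, 0.35]
flagged = flag_significant(series)  # flags the jump up and the drop back
```

The steady +0.01 drift contributes almost nothing to the spread of the differences, so only the spike (and the return from it) is flagged.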

Is there any mathematical approach to finding the significant changes in my data?

Thanks in advance.

(PS: I only have a few samples (~150 days).)

2022-07-25 20:40:24