Blog

Meta-analysis as an analytical tool

In today's scientific and research world, analysts often face the problem of analysing large amounts of data from different studies. In such situations, meta-analysis becomes an indispensable tool. It allows the results of many studies to be assessed collectively and more prec…
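As a minimal sketch of how results from several studies can be combined, here is a fixed-effect, inverse-variance-weighted pooled estimate. The effect sizes and standard errors below are illustrative numbers of our own, not taken from any real study:

```python
import math

# Hypothetical per-study effect sizes and their standard errors
effects = [0.30, 0.50, 0.40]
ses = [0.10, 0.20, 0.15]

# Inverse-variance weights: more precise studies count for more
weights = [1 / se**2 for se in ses]

# Pooled (fixed-effect) estimate and its standard error
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(round(pooled, 3), round(pooled_se, 3))
```

Note how the pooled standard error is smaller than any single study's, which is the precision gain the teaser alludes to.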

General linear models and generalised linear models - differences and similarities

In data analysis, general linear models are widely used because of their simplicity and the ease with which their results can be interpreted. However, analysts sometimes encounter situations in which the assumptions of classical linear models are difficult or impossible to meet. This may be …

Bayesian inference

Bayesian inference is a method of statistical inference. It is named after Thomas Bayes, the eighteenth-century English mathematician and Presbyterian minister whose work laid the foundations of Bayesian probability theory. It is a method of data analysis that allows the probability of certain events to be determined not only…
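A minimal sketch of the idea is Bayes' theorem applied to a diagnostic test. The prevalence and accuracy figures below are hypothetical, chosen only to illustrate how a prior is updated by evidence:

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical example: probability of disease given a positive test.
prevalence = 0.01        # P(disease), the prior
sensitivity = 0.95       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease)

# Total probability of observing a positive test
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Posterior: probability of disease given the positive result
posterior = sensitivity * prevalence / p_positive
print(round(posterior, 3))
```

Even with a quite accurate test, the posterior stays modest because the prior (disease prevalence) is low — the kind of counter-intuitive result Bayesian reasoning makes explicit.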

Data gaps in quantitative data analysis - what are they and how to deal with them?

Missing data in the context of data analysis refers to situations where there are no values for certain variables or observations in a dataset. In other words, they are places where a number, text, or some other form of data was expected, but for various reasons was not there. Missing data can take…
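As a small illustration of the two basic steps — detecting gaps, then filling them — here is simple mean imputation on a hypothetical list where `None` marks a missing value (one of several possible strategies, not necessarily the one the article recommends):

```python
# Hypothetical dataset with missing values (None marks a gap)
ages = [34, None, 29, 41, None, 38]

# Step 1: locate the gaps
missing_idx = [i for i, v in enumerate(ages) if v is None]

# Step 2: mean imputation using only the observed values
observed = [v for v in ages if v is not None]
mean_age = sum(observed) / len(observed)
imputed = [mean_age if v is None else v for v in ages]
print(missing_idx, imputed)
```

Mean imputation is the simplest option; it preserves the sample size but shrinks the variance, which is why more careful methods exist.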

Population pyramid

When looking for the best way to visualise the data you have, you will come across an impressively wide range of different types of charts - from simple, basic ones such as a scatter plot to very advanced ones such as a Sankey diagram. Some, however, are designed with a specific type of data in min…

The three sigma rule

The three-sigma rule is an important tool in statistics and quality management. In the context of data analysis, it allows the identification of outlying points that differ significantly from the rest of the data. The use of the three-sigma rule in quality control also allows anomalies to be …
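The rule itself is easy to sketch: flag any observation further than three standard deviations from the mean. The measurements below are invented for illustration:

```python
import statistics

# Hypothetical measurements clustered near 10, with one clear outlier
data = [10, 11, 9, 10, 12, 11, 10, 9, 11, 10, 12, 10, 11, 9, 10, 50]

mean = statistics.mean(data)
sd = statistics.pstdev(data)  # population standard deviation

# Flag observations lying outside mean +/- 3 sigma
outliers = [x for x in data if abs(x - mean) > 3 * sd]
print(outliers)
```

One caveat worth keeping in mind: a large outlier inflates the standard deviation itself, so in very small samples an extreme point can mask its own detection.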

Segmentation: from grouping to classification

Segmentation is a key process in data analysis, dividing a data set into relatively homogeneous groups based on specific criteria. The purpose of segmentation is to identify hidden patterns, differences and similarities between objects in a dataset, enabling more precise and relevant analyses. Two …
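A tiny sketch of the "grouping" side of segmentation is one-dimensional k-means with two clusters. The values and the crude initialisation are illustrative only, not from the article:

```python
# 1-D k-means with k=2: alternate between assigning points to the
# nearest centroid and moving each centroid to its group's mean.
values = [1.0, 1.2, 0.8, 8.0, 8.5, 7.9]
centroids = [values[0], values[3]]  # crude initialisation

for _ in range(10):  # a few refinement passes suffice here
    groups = [[], []]
    for v in values:
        nearest = min((0, 1), key=lambda i: abs(v - centroids[i]))
        groups[nearest].append(v)
    # With this data both groups stay non-empty on every pass
    centroids = [sum(g) / len(g) for g in groups]
print([round(c, 2) for c in centroids])
```

Classification, by contrast, would start from groups that are already labelled and learn a rule for assigning new objects to them.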

Recoding quantitative variables into qualitative ones – techniques and their practical application

When analysing data, we take into account both quantitative information (such as salary, age, or the number of products ordered) and qualitative information (e.g. gender, education, or level of satisfaction with a service). In order to make it easier to work with the data or to adapt it to a specific stati…
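The most common recoding technique is binning: mapping a numeric value into a category via cut-points. The age brackets below are hypothetical, chosen only to show the mechanics:

```python
# Recode a quantitative variable (age) into qualitative categories.
# The cut-points 18 and 65 are illustrative, not prescriptive.
def age_group(age):
    if age < 18:
        return "minor"
    elif age < 65:
        return "adult"
    else:
        return "senior"

ages = [12, 25, 40, 67, 70]
groups = [age_group(a) for a in ages]
print(groups)  # ['minor', 'adult', 'adult', 'senior', 'senior']
```

In practice the cut-points come either from domain knowledge (as here) or from the data itself, e.g. quartile-based bins.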

Outlier or anomaly? Detection of abnormal observations

Can a single abnormal occurrence cause concern? Should a red light start flashing on the basis of one deviation from the norm? Of course! In many industries and businesses, an anomaly is a signal that must be reacted to quickly and efficiently in order to prevent consequences. So how do you recognise an anomaly…
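The article's own answer is truncated above; as one common detection technique (our assumption, not necessarily the method the article describes), the interquartile-range rule flags points far outside the middle half of the data:

```python
import statistics

# Hypothetical data with one extreme reading
data = [10, 12, 11, 9, 10, 11, 120, 10, 12, 11]

# Quartiles and the 1.5 * IQR fences
q1, q2, q3 = statistics.quantiles(data, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

anomalies = [x for x in data if x < low or x > high]
print(anomalies)
```

Unlike the three-sigma rule, the IQR fences are based on quartiles, so a single extreme value cannot stretch them and hide itself.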
