
Toward a Better Understanding of News User Journeys: A Markov Chain Approach


Figures & data

Figure 1. Example of a two-state Markov chain.

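The two-state chain in Figure 1 can be sketched in code. This is a minimal illustration, not the paper's model: the transition probabilities below are hypothetical placeholders, since Figure 1's actual values are not given in the text.

```python
import random

STATES = ["A", "B"]

# P[current][next] = probability of moving from `current` to `next`.
# Each row must sum to 1; the numbers here are illustrative only.
P = {
    "A": {"A": 0.7, "B": 0.3},
    "B": {"A": 0.4, "B": 0.6},
}

def step(state, rng):
    """Draw the next state given the current one by inverse-CDF sampling."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Simulate n transitions from `start`, returning the visited states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain
```

With more states the same structure applies; only the state set and the transition matrix grow.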

Table 1. An exemplary selection of the data from user Z.

Table 2. Transition probabilities training set (80%)—six states.

Table 3. Transition probabilities test set (20%)—six states.
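Tables 2 and 3 report transition probabilities estimated separately on an 80% training set and a 20% test set over six states. A hedged sketch of how such probabilities are typically computed from user page-view sequences (the state names match the six states named in Figure 2, but the sessions and the split logic here are illustrative, not the study's data or method):

```python
from collections import defaultdict

# Six states as in Figure 2; "Home" abbreviates "Homepage or section page".
STATES = ["Home", "Politics", "Business", "Entertainment", "Other", "End"]

def estimate_transitions(sequences):
    """Maximum-likelihood transition probabilities from observed sequences:
    count each (current, next) pair, then normalize each row."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    probs = {}
    for cur, row in counts.items():
        total = sum(row.values())
        probs[cur] = {nxt: c / total for nxt, c in row.items()}
    return probs

# Toy sessions; an 80/20 split mirrors the paper's train/test setup.
sessions = [
    ["Home", "Politics", "Politics", "End"],
    ["Home", "Business", "Other", "End"],
    ["Politics", "Entertainment", "End"],
    ["Home", "Politics", "Business", "End"],
    ["Home", "Entertainment", "End"],
]
split = int(0.8 * len(sessions))
train, test = sessions[:split], sessions[split:]
P_train = estimate_transitions(train)
```

Comparing `P_train` with the matrix estimated on the held-out sessions is one way to check how stable the transition structure is, which is what juxtaposing Tables 2 and 3 allows.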

Figure 2. The probability of users changing from one news topic to another news topic: Homepage or section page, Politics, Business, Entertainment, Other, and End of Web session.


Figure 3. The probability of users changing between hard and soft news: Homepage or section page, Hard news, Soft news, and End of Web session.


Table 4. Transition probabilities training set (80%)—four states.

Table 5. Transition probabilities test set (20%)—four states.