Which Premier League Teams Overperform Their xG?

Every season in the Premier League, a handful of teams score noticeably more goals than their Expected Goals data says they should. Their xG says 55 goals across the season, but they bag 65. Or their xG says 1.2 per game, but they average 1.6.

This is xG overperformance, and it raises one of the most debated questions in football analytics: is it skill, or is it luck?

The answer, frustratingly, is both. But not in equal measure. And understanding which part is skill and which part is luck is crucial for making good predictions.

What Overperformance Looks Like in the Data

When we talk about xG overperformance, we mean the gap between actual goals scored and xG — specifically, when actual goals exceed xG by a significant margin.

A team that creates 1.5 xG per game and scores 1.5 goals per game is performing right on expectation. A team that creates 1.5 xG per game and scores 2.0 goals per game is overperforming by 0.5 goals per match — which over a 38-game season is roughly 19 extra goals. That's an enormous number. It's the difference between finishing sixth and finishing second.
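The arithmetic above is simple enough to sketch directly. A minimal example, using the hypothetical figures from this section rather than real team data:

```python
# Per-game xG overperformance extrapolated across a 38-game
# Premier League season. Figures are the hypothetical ones from
# the text, not real team data.

GAMES_PER_SEASON = 38

def season_overperformance(goals_per_game: float, xg_per_game: float) -> float:
    """Extra goals over a full season versus expectation."""
    return (goals_per_game - xg_per_game) * GAMES_PER_SEASON

extra = season_overperformance(goals_per_game=2.0, xg_per_game=1.5)
print(f"{extra:.0f} extra goals over a season")  # 19 extra goals over a season
```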

The key question is whether that 0.5 goals per game of overperformance will continue next season, or whether the team will regress to scoring closer to their xG.

The Case for Skill: Clinical Finishing Is Real

Let's start with the argument that overperformance can be a genuine skill.

Elite strikers make a difference

Not all players are created equal. A chance worth 0.15 xG when taken by the average player might be worth 0.25 when taken by an elite finisher. The xG model uses the average conversion rate for that type of chance, but Mo Salah, Erling Haaland, or Harry Kane are not average finishers. They consistently place shots more accurately, make better decisions about when to shoot, and execute under pressure at a higher level.

This is measurable. If you track individual players' xG overperformance across multiple seasons, the best finishers do sustain a positive gap. Haaland, for example, arrived at Manchester City with a career record of significantly outscoring his xG. He then continued to outscore his xG in the Premier League. That's not luck — it's a player whose finishing ability genuinely exceeds what the positional model expects.

Shot placement matters

Standard xG models primarily measure where the shot was taken from, not where it goes. Two shots from 12 yards out in a central position might both be valued at 0.15 xG, but one is drilled low into the bottom corner and the other is tamely hit straight at the goalkeeper. The outcome is different, and the quality of the strike is the differentiator. (Post-shot models, such as xG on target, attempt to capture this, but the basic xG figure does not.)

Teams with technically superior finishers will, all else being equal, convert a higher proportion of their chances. This shows up as xG overperformance.

Some systems create better chances than xG captures

There's also an argument that certain playing styles create chances that are better than the xG model suggests. A team that plays rapid, incisive passing football might arrive at shooting positions with the defence out of position and the goalkeeper unsighted — contextual factors that basic xG models undervalue.

Pre-shot xG models that account for defensive positioning tend to rate these chances higher, which suggests the "overperformance" is partly an artefact of the model not fully capturing the quality of the chance.

The Case for Luck: Regression Is Coming

Now for the other side. And this is where the data is pretty emphatic.

Most overperformance doesn't persist

Season after season, the research shows the same thing: teams that significantly overperform their xG in one campaign tend to score closer to their xG in the next. The correlation between xG overperformance in consecutive seasons is weak. Very weak.

This doesn't mean it drops to zero — there is a small, persistent skill component. But the bulk of any single-season overperformance is variance. A team that outscored their xG by 15 goals last season is far more likely to outscore it by 3 or 4 goals next season than by 15 again.

The randomness of goalkeeping

A significant chunk of xG overperformance comes not from the shooting team's skill but from the opposing goalkeeper having a bad day. Every team occasionally faces a goalkeeper who is beaten by shots he would normally save, and when those lapses occur is essentially random. Over a season, some teams benefit from this more than others, simply by chance.

Hot streaks aren't predictive

A striker who scores 10 goals from 5 xG in the first half of the season might look like a clinical finisher. But often, the second half of the season sees a return to normal rates. What looked like elite finishing was actually a hot streak — an unsustainable run where everything hit the net.

The human brain is wired to see patterns and assign narratives. "He's in the form of his life" is more compelling than "he's experiencing positive variance." But the latter is usually more accurate.

What the Historical Data Shows

Let's look at some patterns from recent Premier League seasons.

Teams with elite strikers DO sustain mild overperformance

When Manchester City signed Haaland, their xG overperformance was notable. This makes sense — Haaland's finishing quality is exceptional, and placing him at the end of City's high-xG-output system was always likely to produce actual goals above the model's expectation.

But even with Haaland, the overperformance has been in the range of 5-10 goals across a season — significant, but not the 15-20 goal outliers you sometimes see from teams that are riding pure variance.

Mid-table overperformers almost always regress

When a mid-table team — let's say a newly promoted side or an established lower-half team — finishes with actual goals significantly above their xG, it's almost always followed by a drop-off. These teams typically don't have the elite finishing talent to sustain overperformance, and the next season their goal output falls back in line with (or below) their xG.

This pattern is one of the most reliable in Premier League analytics. A mid-table team that overperformed their xG by 10+ goals is a strong regression candidate for the following season.

The biggest single-season overperformances are almost never repeated

Across the last decade of Premier League data, the teams with the largest positive gaps between actual goals and xG in a single season almost never replicate that gap the following year. The correlation is close to zero for extreme overperformers. Some of these teams don't even finish in the same half of the table.

Counter-Attacking Teams: A Special Case

Teams that play primarily on the counter tend to generate a specific type of chance: fast breaks against disorganised defences. These chances sometimes have a higher conversion rate than their raw xG suggests, because the xG model can't fully capture the chaos and defensive disorganisation of a team caught on the break.

This is a genuine but limited edge. It works brilliantly against attacking teams who leave space behind, but offers little advantage against low blocks. And it depends heavily on having the right personnel — pacey forwards who can finish in transition are a specific skill set.

How to Think About Overperformance for Predictions

If you're using xG data to predict future performance, here's the framework:

Step 1: Identify the overperformance

Calculate the gap between actual goals and xG for the team you're analysing. A gap of more than 5 goals over a season indicates significant overperformance that needs to be accounted for.

Step 2: Assess the source

Is the overperformance concentrated in one player? If so, is that player an established elite finisher with a multi-season track record of outscoring xG, or is it a player having a career-best year? The former is more sustainable than the latter.

Is the overperformance spread across the squad? If so, it's almost certainly variance and will regress.

Step 3: Adjust your expectations

For prediction purposes, you should assume that most xG overperformance will partially regress. A reasonable rule of thumb:

  • Teams with an elite proven finisher: Expect roughly 30-40% of the overperformance to persist
  • Teams without an elite finisher: Expect roughly 10-20% to persist
  • Teams with extreme overperformance (15+ goals above xG): Expect significant regression regardless of squad quality
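Steps 1 to 3 can be sketched as a single function. This is a hedged illustration: the 5-goal noise threshold, the 15-goal extreme cutoff, and the persistence rates are the rough rules of thumb from the text, not calibrated model parameters.

```python
# Steps 1-3 as one function: measure the goals-vs-xG gap, then
# project how much of it might persist next season using the
# rule-of-thumb persistence rates above. All thresholds and rates
# are illustrative assumptions, not fitted values.

def projected_persistent_overperformance(goals: float, xg: float,
                                         elite_finisher: bool) -> float:
    """Estimate how many goals of overperformance might carry over."""
    gap = goals - xg                 # Step 1: the overperformance
    if gap <= 5:
        return 0.0                   # within noise; project nothing
    if gap >= 15:
        persistence = 0.10           # extreme gaps: expect heavy regression
    elif elite_finisher:
        persistence = 0.35           # midpoint of the 30-40% rule of thumb
    else:
        persistence = 0.15           # midpoint of the 10-20% rule of thumb
    return gap * persistence         # Step 3: adjusted expectation

# A team 12 goals over xG with a proven elite finisher:
print(round(projected_persistent_overperformance(72, 60, True), 1))  # 4.2
```

The point of the sketch is the shape of the logic, not the exact numbers: small gaps are ignored, extreme gaps are discounted hard, and only a minority of the rest is carried forward.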

Step 4: Focus on the xG, not the goals

When assessing how good a team is, give more weight to their xG than their actual goals. A team creating 2.0 xG per game is more likely to remain strong than a team creating 1.4 xG per game but scoring 1.9 due to overperformance. The former has a process that generates quality chances. The latter is relying on converting at an unsustainable rate.
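One simple way to act on "weight the xG over the goals" is a shrinkage estimate that blends a team's scoring rate toward their xG rate. The 0.7 weight on xG here is an illustrative assumption, not a published figure:

```python
# A shrinkage estimate of future goals per game, leaning on xG.
# The 0.7 weight on xG is an illustrative assumption.

def blended_scoring_rate(xg_per_game: float, goals_per_game: float,
                         xg_weight: float = 0.7) -> float:
    """Weighted estimate of future goals per game, favouring xG."""
    return xg_weight * xg_per_game + (1 - xg_weight) * goals_per_game

# The two teams from the text:
strong_process = blended_scoring_rate(xg_per_game=2.0, goals_per_game=2.0)
hot_streak = blended_scoring_rate(xg_per_game=1.4, goals_per_game=1.9)
print(round(strong_process, 2), round(hot_streak, 2))  # 2.0 1.55
```

Note how the team with the strong process keeps its full rate, while the overperformer's projection lands much closer to its xG than to its actual goals.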

The Underperformance Flip Side

Perhaps even more useful for predictions than overperformance is its opposite. Teams that significantly underperform their xG — creating loads of chances but not scoring — are among the strongest "buy low" candidates in football analytics.

Brighton under De Zerbi were the textbook example: regularly posting xG numbers approaching 2.0 per game while failing, for long stretches, to convert at the expected rate. Their underlying numbers screamed top-six quality while their results said mid-table. History tells us that kind of underperformance tends to correct itself. The goals eventually come.

Why This Matters

Understanding xG overperformance isn't just an academic exercise. It has direct implications for prediction:

  • A team that won the league partly through xG overperformance is more vulnerable than the table suggests
  • A team that finished mid-table despite strong xG numbers but actual underperformance is a candidate for improvement
  • Transfer decisions that chase goals without understanding xG can lead clubs to overpay for strikers having unsustainable seasons

The smart analysis — whether for predictions, fantasy football, or just informed pub debate — always looks at the underlying data rather than just the goals column.

Conclusion

xG overperformance is one of the most misunderstood concepts in football analytics. Yes, finishing quality is real, and yes, elite strikers do sustain mild overperformance. But the majority of any single-season overperformance is variance — luck that won't persist.

For prediction, the lesson is clear: trust the xG more than the goals. A team's chance creation and chance prevention — measured by xG for and xG against — is far more predictive of future performance than their actual goal tallies. When a team's goals significantly exceed their xG, be cautious about assuming it will continue. More often than not, the regression is coming.

And when it comes, the form table won't see it. But the xG data will have been warning you all along.
