---
title: "💭 A ChatGPT prompt equals about 5.1 seconds of Netflix"
description: "!https://simonwillison.net/2025/Nov/29/chatgpt-netflix"
date: 2025-12-01
published: true
tags:
  - ai
  - thought
template: link
---


<div class="embed-card embed-card-external">
  <a href="https://simonwillison.net/2025/Nov/29/chatgpt-netflix" class="embed-card-link" target="_blank" rel="noopener noreferrer">
    <div class="embed-card-content">
      <div class="embed-card-title">A ChatGPT prompt equals about 5.1 seconds of Netflix</div>
      <div class="embed-card-description">In June 2025 Sam Altman claimed about ChatGPT that &#34;the average query uses about 0.34 watt-hours&#34;. In March 2020 George Kamiya of the International Energy Agency estimated that &#34;streaming a …</div>
      <div class="embed-card-meta">Simon Willison’s Weblog &middot; simonwillison.net</div>
    </div>
  </a>
</div>


As we enter a world that is more and more dependent on AI, it feels very promising for the future that inference is this cheap.  I had not understood the scale of how much cheaper inference is compared to training.  As training techniques improve, I imagine inference gets significantly cheaper still.  I know the labs all claim to be profitable on inference, but scrolling through Simon's feed here you see several articles on the stark difference.
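The headline figure is easy to sanity-check. Assuming Altman's claimed 0.34 watt-hours per query and the upper end of Kamiya's streaming estimate (0.24 kWh per hour, the figure Simon's post works from), the back-of-the-envelope math looks like this:

```python
# Back-of-the-envelope check of the headline figure.
# Assumptions: 0.34 Wh per ChatGPT query (Altman's June 2025 claim)
# and 0.24 kWh per hour of Netflix streaming (Kamiya's upper estimate).
query_wh = 0.34
streaming_wh_per_hour = 0.24 * 1000  # 0.24 kWh -> 240 Wh

# Hours of streaming with the same energy, converted to seconds.
seconds = query_wh / streaming_wh_per_hour * 3600
print(f"{seconds:.1f} seconds of Netflix per prompt")
# -> 5.1 seconds of Netflix per prompt
```

Kamiya's lower estimate (0.12 kWh/hour) would double that to about 10 seconds, so the comparison holds up either way.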

!!! note

    This post is a <a href="/thoughts/" class="wikilink" data-title="Thoughts" data-description="These are generally my thoughts on a web page or some sort of url, except a rare few don&#39;t have a link. These are dual published off of my..." data-date="2024-04-01">thought</a>. It's a short note that I make
    about someone else's content online <a href="/tags/thoughts/" class="hashtag-tag" data-tag="thoughts" data-count=2 data-reading-time=3 data-reading-time-text="3 minutes">#thoughts</a>
