HTTP GET sees data only after request completion

Apologies if I’m missing something obvious; I’m new to both web dev and Playdate.

I’m trying out the new HTTP API, but for certain websites the Playdate only seems to see data after the request has completed. Code to reproduce (modified and simplified from the networking example):

import "CoreLibs/utilities/where"
import "CoreLibs/object"

local net <const> = playdate.network

playdate.display.setRefreshRate(50)

local url = "google.com"
local http_done = false
local http_waiting = false
local http_data_received = false
local http_conn = nil
local start_time <const> = playdate.getCurrentTimeMilliseconds()

function timeLog(text)
    local now = playdate.getCurrentTimeMilliseconds()
    local elapsed = now - start_time
    print(string.format("[%i] %s", elapsed, text))
end

function headersRead()
    timeLog("HTTP headersRead called")
    local response = http_conn:getResponseStatus()
    timeLog(string.format("\tHTTP GET getResponseStatus: %i", response))
end

function connectionClosed()
    timeLog("HTTP connectionClosed called")
end

function requestComplete()
    timeLog("HTTP requestComplete called")
    local bytes = http_conn:getBytesAvailable()
    timeLog(string.format("\tHTTP GET getBytesAvailable: %i", bytes))
end

function requestCallback()
    local bytes = http_conn:getBytesAvailable()
    timeLog("HTTP requestCallback called, "..bytes.." bytes are available for reading")
    local data = http_conn:read(bytes)
    print(data)
    http_data_received = true
end

function playdate.update()

    if not http_done then
        if not http_waiting then

            http_conn = net.http.new(url, nil, true, "HTTP Demo")
            assert(http_conn, "The user needs to allow this")

            http_conn:setHeadersReadCallback(headersRead)
            http_conn:setConnectionClosedCallback(connectionClosed)
            http_conn:setRequestCompleteCallback(requestComplete)
            http_conn:setRequestCallback(requestCallback)

            http_conn:setConnectTimeout(2)
            http_conn:setKeepAlive(true)

            local get_request, get_error = http_conn:get("/")
            assert(get_request, get_error)

            http_waiting = true

        else
            if http_data_received then
                http_conn:close()
                http_done = true
            else
                local bytes = http_conn:getBytesAvailable()
                timeLog(string.format("\tHTTP GET getBytesAvailable: %i", bytes))

                local wait_time = 100
                timeLog(string.format("HTTP Waiting %i ms for data: %s", wait_time, http_conn:getError() or "OK"))
                playdate.wait(wait_time)
            end
        end
    end

end

This would produce a log like the following:

[3397] HTTP Waiting 100 ms for data: OK
[3536] 	HTTP GET getBytesAvailable: 0
[3537] HTTP Waiting 100 ms for data: OK
[3676] 	HTTP GET getBytesAvailable: 0
[3678] HTTP Waiting 100 ms for data: OK
[3717] HTTP headersRead called
[3718] 	HTTP GET getResponseStatus: 0
[3718] HTTP requestComplete called
[3719] 	HTTP GET getBytesAvailable: 0
[3817] 	HTTP GET getBytesAvailable: 5069
[3818] HTTP Waiting 100 ms for data: OK
[3957] 	HTTP GET getBytesAvailable: 18645
[3959] HTTP Waiting 100 ms for data: OK
[4098] 	HTTP GET getBytesAvailable: 18645
[4099] HTTP Waiting 100 ms for data: OK
[4239] 	HTTP GET getBytesAvailable: 18645

Here one can see that headersRead returns a 0 status code, requestCallback is never called, and the requestComplete callback fires while no data is visible, yet manually polling shows bytes available after the request has completed.

However, everything works fine if I change the URL from google.com to e.g. example.com.

Finally had a chance to look into this, sorry for the delay! It looks like this is only affecting the WX simulator, works correctly on the device. I don't know exactly what the problem is yet but I noticed the request was getting a 301 redirect from google.com to www.google.com. I changed the server address in the demo to avoid the redirect and it does work correctly after that. Since there's a workaround I'm not going to hold up 3.0.1 release for this but I've got it assigned to 3.0.2. Thanks for catching this!

Thank you for looking into this! Adding www worked for google.com but not for some other websites, for example www.csmonitor.com. In general the request API seems pretty sensitive to URL formatting; e.g. dropping a trailing slash can break things.
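To illustrate the trailing-slash sensitivity, here's a minimal sketch (hypothetical host and path, same connection setup as the example above):

```lua
-- Hypothetical illustration of the formatting sensitivity described above.
local conn = playdate.network.http.new("www.example.com")

-- These two requests should fetch the same resource, but in practice
-- one form may trigger a redirect the API doesn't handle well:
conn:get("/articles/")   -- with trailing slash
conn:get("/articles")    -- without trailing slash
```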

So the issues above only occur in the simulator and work fine on device. However, I ran into a new issue that also occurs on device: calling get("/somepath") doesn’t follow the path but instead receives the content at the server root, i.e. the content of get("/"):

import "CoreLibs/graphics"
import "CoreLibs/utilities/where"

local gfx <const> = playdate.graphics
local net <const> = playdate.network

local http = {
    conn = nil,
    waiting = false,
    done = false,
    closed = false,
    bytes = 0,
    body = "",
    error = nil,
}

local function startHttpRequest()
    http.conn = net.http.new("text.npr.org")
    assert(http.conn, "Please allow this app to access the network")

    http.conn:setHeadersReadCallback(function ()
        local status = http.conn:getResponseStatus()
    end)

    http.conn:setRequestCallback(function ()
        local available = http.conn:getBytesAvailable() or 0
        if available <= 0 then
            return
        end
        local chunk = http.conn:read(available) or ""
        print(chunk)
        http.bytes = http.bytes + #chunk
        http.body = http.body .. chunk
    end)

    http.conn:setRequestCompleteCallback(function ()
        http.done = true
        http.error = http.conn:getError()
    end)

    http.conn:setConnectionClosedCallback(function ()
        http.closed = true
    end)

    http.conn:setConnectTimeout(2)
    http.conn:setKeepAlive(false)

    local ok, err = http.conn:get("/nx-s1-5603659")
    assert(ok, err)
    http.waiting = true
end

function playdate.update()
    gfx.clear()

    if not http.waiting then
        startHttpRequest()
    end

    if http.done and http.conn ~= nil and not http.closed then
        http.conn:close()
    end
end

Thanks for the report (and sorry as always for the slow reply)! I'll take a look when I have a chance. My bet is it's following a redirect and not requesting the correct path from the new location, but we'll see. I've got it on my todo list, and I also filed an issue and set a due date of next Monday on it--hopefully that'll keep this from getting lost in the chaos. :crossed_fingers:


Turns out their redirect is goofed. :frowning:

dave@macbook ~ % telnet text.npr.org 80
Trying 2600:1409:9800:1d::17d8:9108...
Connected to e103193.dsca.akamaiedge.net.
Escape character is '^]'.
GET /nx-s1-5603659 HTTP/1.1
Host: text.npr.org

HTTP/1.1 301 Moved Permanently
Server: AkamaiGHost
Content-Length: 0
Location: https://text.npr.org/
Date: Wed, 19 Nov 2025 01:33:10 GMT
Connection: keep-alive

In this case if you use https by doing conn = net.http.new("text.npr.org", true) it'll avoid the redirect and work as expected.
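For reference, a minimal sketch of that change applied to the example above (mirroring the call form from the previous post):

```lua
-- Open the connection over HTTPS, as suggested, so the request goes
-- straight to https://text.npr.org and never hits the http -> https
-- 301 redirect shown in the telnet session above.
http.conn = net.http.new("text.npr.org", true)
assert(http.conn, "Please allow this app to access the network")

-- The rest of the setup is unchanged; the path is now requested
-- directly over SSL.
local ok, err = http.conn:get("/nx-s1-5603659")
assert(ok, err)
```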

Aha, this also points out that my original code net.http.new(url, nil, true, "HTTP Demo") made the mistake of passing nil as the port, which I thought would default to the SSL port 443.

So the current workaround is to always include www (google.com still doesn’t work) and always use SSL with http.new(url, true). Basically, try to avoid getting redirected.
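A small helper capturing both workarounds might look like this (the `openConnection` name is hypothetical, and it assumes the four-argument new(server, port, usessl, reason) form from the first example, with a nil port defaulting to 443 when SSL is on):

```lua
-- Hypothetical helper wrapping the two workarounds discussed above:
-- prepend "www." when the host is a bare domain, and always use SSL.
local function openConnection(host, reason)
    -- Count the dots; a single dot suggests a bare domain like "google.com".
    local _, dots = host:gsub("%.", "")
    if dots == 1 then
        host = "www." .. host
    end
    -- Always request over SSL (assumed: nil port defaults to 443 here).
    return playdate.network.http.new(host, nil, true, reason)
end

local conn = openConnection("google.com", "HTTP Demo")  -- www.google.com over SSL
```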

Given that, would it make sense to default to using SSL in http.new?

tbh I don't love the idea of something called "http" using https by default. What I'd rather do is make a separate playdate.network.https.new() function which actually returns a playdate.network.http object but sets a flag that tells it to use ssl. It's a bit of a hack but it makes the game code easier to read. I really hate the arbitrary "true" argument for enabling ssl, totally non-obvious.
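Until something like that ships, game code could approximate the proposed API with a thin shim (a sketch only, not the eventual SDK function; it assumes the four-argument new(server, port, usessl, reason) form from earlier in the thread):

```lua
-- Hypothetical shim for the proposed playdate.network.https.new():
-- it returns an ordinary playdate.network.http object with the SSL
-- flag set, so call sites read https.new(...) instead of passing a
-- bare, non-obvious "true" argument.
playdate.network.https = {
    new = function(server, reason)
        return playdate.network.http.new(server, nil, true, reason)
    end
}

local conn = playdate.network.https.new("example.com", "HTTP Demo")
```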

Regarding the original problem: I discovered a race condition in the redirect handling that's causing this. It'll take some low-level surgery to fix and we'll need to make sure we test it thoroughly, so I'm targeting that for 3.1--and the https thing too, since it's an API addition. That'll be some time after the new year, hopefully not too long.
