HTTP GET sees data only after request completion

Apologies if I’m missing something obvious; I’m new to both web dev and the Playdate.

I’m trying out the new HTTP API, but it seems that for certain websites the Playdate only sees data after the request has completed. Code to reproduce (modified and simplified from the networking example):

import "CoreLibs/utilities/where"
import "CoreLibs/object"

local net <const> = playdate.network

playdate.display.setRefreshRate(50)

local url = "google.com"
local http_done = false
local http_waiting = false
local http_data_received = false
local http_conn = nil
local start_time <const> = playdate.getCurrentTimeMilliseconds()

function timeLog(text)
    local now = playdate.getCurrentTimeMilliseconds()
    local elapsed = now - start_time
    print(string.format("[%i] %s", elapsed, text))
end

function headersRead()
    timeLog("HTTP headersRead called")
    local response = http_conn:getResponseStatus()
    timeLog(string.format("\tHTTP GET getResponseStatus: %i", response))
end

function connectionClosed()
    timeLog("HTTP connectionClosed called")
end

function requestComplete()
    timeLog("HTTP requestComplete called")
    local bytes = http_conn:getBytesAvailable()
    timeLog(string.format("\tHTTP GET getBytesAvailable: %i", bytes))
end

function requestCallback()
    local bytes = http_conn:getBytesAvailable()
    timeLog(string.format("HTTP requestCallback called, %i bytes are available for reading", bytes))
    local data = http_conn:read(bytes)
    print(data)
    http_data_received = true
end

function playdate.update()

    if not http_done then
        if not http_waiting then

            http_conn = net.http.new(url, nil, true, "HTTP Demo")
            assert(http_conn, "The user needs to allow this")

            http_conn:setHeadersReadCallback(headersRead)
            http_conn:setConnectionClosedCallback(connectionClosed)
            http_conn:setRequestCompleteCallback(requestComplete)
            http_conn:setRequestCallback(requestCallback)

            http_conn:setConnectTimeout(2)
            http_conn:setKeepAlive(true)

            local get_request, get_error = http_conn:get("/")
            assert(get_request, get_error)

            http_waiting = true

        else
            if http_data_received then
                http_conn:close()
                http_done = true
            else
                local bytes = http_conn:getBytesAvailable()
                timeLog(string.format("\tHTTP GET getBytesAvailable: %i", bytes))

                local wait_time = 100
                timeLog(string.format("HTTP Waiting %i ms for data: %s", wait_time, http_conn:getError() or "OK"))
                playdate.wait(wait_time)
            end
        end
    end

end

This would produce a log like the following:

[3397] HTTP Waiting 100 ms for data: OK
[3536] 	HTTP GET getBytesAvailable: 0
[3537] HTTP Waiting 100 ms for data: OK
[3676] 	HTTP GET getBytesAvailable: 0
[3678] HTTP Waiting 100 ms for data: OK
[3717] HTTP headersRead called
[3718] 	HTTP GET getResponseStatus: 0
[3718] HTTP requestComplete called
[3719] 	HTTP GET getBytesAvailable: 0
[3817] 	HTTP GET getBytesAvailable: 5069
[3818] HTTP Waiting 100 ms for data: OK
[3957] 	HTTP GET getBytesAvailable: 18645
[3959] HTTP Waiting 100 ms for data: OK
[4098] 	HTTP GET getBytesAvailable: 18645
[4099] HTTP Waiting 100 ms for data: OK
[4239] 	HTTP GET getBytesAvailable: 18645

From the log one can see that headersRead reports a 0 status code, requestCallback is never called, and the requestComplete callback fires before any data is visible, yet manually polling shows bytes available shortly after the request completes.

However, everything works fine if I change the URL from google.com to, e.g., example.com.

Finally had a chance to look into this, sorry for the delay! It looks like this only affects the WX simulator; it works correctly on the device. I don't know exactly what the problem is yet, but I noticed the request was getting a 301 redirect from google.com to www.google.com. I changed the server address in the demo to avoid the redirect, and it works correctly after that. Since there's a workaround I'm not going to hold up the 3.0.1 release for this, but I've got it assigned to 3.0.2. Thanks for catching this!
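For anyone following along, the workaround amounts to pointing the request at the post-redirect host directly so no 301 is involved. A minimal sketch based on the repro code above (whether and where a given site redirects is site-specific, so check the Location header for your own target):

```lua
-- Sketch: request the post-redirect host directly so the simulator
-- never has to handle a 301 response.
local url = "www.google.com"  -- was "google.com", which 301-redirects to www.google.com

http_conn = playdate.network.http.new(url, nil, true, "HTTP Demo")
assert(http_conn, "The user needs to allow this")
-- ...set the same callbacks as in the original code...
local get_request, get_error = http_conn:get("/")
assert(get_request, get_error)
```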

Thank you for looking into this! Adding www worked for Google but not for some other websites, for example www.csmonitor.com. In general the request API seems pretty sensitive to URL formatting; e.g., dropping a trailing slash can break things.
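To make the formatting sensitivity concrete, here is an illustrative sketch with the same setup as above, where the only difference between the two requests is the path string (hypothetical example; which sites fail this way seems to vary):

```lua
-- Illustrative only: on some servers these two requests behave
-- differently in the simulator, per the reports in this thread.
local get_request, get_error = http_conn:get("/")  -- root path with slash
-- versus
local get_request2, get_error2 = http_conn:get("") -- slash dropped: can break on some sites
```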