Bitswap: litep2p crashes older versions of kubo by omitting wantlist #457

@dmitry-markin

Description

Despite the wantlist always being set in Bitswap messages sent by litep2p, older kubo IPFS clients with a null-pointer-dereference bug sometimes crash when querying data from litep2p. This happened once with kubo v0.35.0 against litep2p v0.11.0.
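The intent of "wantlist always set" can be enforced defensively on the sending side by populating the field right before encoding, so it can never be `None` on the wire. A minimal sketch with hypothetical prost-style types (the real litep2p message structs differ):

```rust
// Hypothetical prost-style types, for illustration only; the actual
// litep2p Bitswap protobuf structs are generated and named differently.
#[derive(Default, Debug, PartialEq)]
struct Wantlist {
    entries: Vec<Vec<u8>>,
}

#[derive(Default, Debug)]
struct BitswapMessage {
    // prost maps a proto3 embedded-message field to Option<T>;
    // a None value is encoded as zero bytes (field absent on the wire).
    wantlist: Option<Wantlist>,
}

/// Force the wantlist field to be present (possibly empty) before encoding.
fn ensure_wantlist(msg: &mut BitswapMessage) {
    msg.wantlist.get_or_insert_with(Wantlist::default);
}

fn main() {
    let mut msg = BitswapMessage::default();
    assert!(msg.wantlist.is_none());

    ensure_wantlist(&mut msg);
    assert_eq!(msg.wantlist, Some(Wantlist::default()));
}
```

Note that this guard alone may not be sufficient if the encoder still elides a `Some(Wantlist::default())` field as all-default content; see the serialization hypothesis below the logs.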

kubo logs:

Kubo version: 0.35.0                                                                                                                                                                                                 
Repo version: 16                                                                                                                                                                                                     
System version: amd64/linux                                                                                                                                                                                          
Golang version: go1.24.3                                                                                                                                                                                             
PeerID: 12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi
2025/10/22 16:42:22 failed to sufficiently increase receive buffer size (was: 208 kiB, wanted: 7168 kiB, got: 416 kiB). See https://github.com/quic-go/quic-go/wiki/UDP-Buffer-Sizes for details.
Swarm listening on 127.0.0.1:4001 (TCP+UDP)
Swarm listening on [::1]:4001 (TCP+UDP)
Run 'ipfs id' to inspect announced and discovered multiaddrs of this node.
RPC API server listening on /ip4/127.0.0.1/tcp/5001
WebUI: http://127.0.0.1:5001/webui
Gateway server listening on /ip4/127.0.0.1/tcp/8080
Daemon is ready                                      
2025-10-22T16:43:23.215+0300    ERROR   cmd/ipfs        kubo/daemon.go:1219
⚠️  A NEW VERSION OF KUBO DETECTED                                                                        

This Kubo node is running an outdated version (0.35.0).
29% of the sampled Kubo peers are running a higher version.
Visit https://github.com/ipfs/kubo/releases or https://dist.ipfs.tech/#kubo and update to version 0.38.1 or later.
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0x179df86]

goroutine 32992 [running]:
github.com/ipfs/boxo/bitswap/message.newMessageFromProto(0xc001de4900)
        github.com/ipfs/[email protected]/bitswap/message/message.go:193 +0x26
github.com/ipfs/boxo/bitswap/message.FromMsgReader({0x74ef5c7b0798, 0xc001913380})
        github.com/ipfs/[email protected]/bitswap/message/message.go:438 +0xaf
github.com/ipfs/boxo/bitswap/network/bsnet.(*impl).handleNewStream(0xc0013aabe0, {0x3a451b0, 0xc001a22c80})
        github.com/ipfs/[email protected]/bitswap/network/bsnet/ipfs_impl.go:442 +0x2a5
github.com/libp2p/go-libp2p/p2p/host/basic.(*BasicHost).SetStreamHandler.func1({0x0?, 0x0?}, {0x74ef5c693348?, 0xc001a22c80?})
        github.com/libp2p/[email protected]/p2p/host/basic/basic_host.go:681 +0x82
github.com/libp2p/go-libp2p/p2p/host/basic.(*BasicHost).newStreamHandler(0xc0007903c0, {0x3a451b0, 0xc001a22c80})
        github.com/libp2p/[email protected]/p2p/host/basic/basic_host.go:493 +0x7f7
github.com/libp2p/go-libp2p/p2p/net/swarm.(*Conn).start.func1.1()
        github.com/libp2p/[email protected]/p2p/net/swarm/swarm_conn.go:155 +0xa5
created by github.com/libp2p/go-libp2p/p2p/net/swarm.(*Conn).start.func1 in goroutine 32419
        github.com/libp2p/[email protected]/p2p/net/swarm/swarm_conn.go:141 +0x1ab

polkadot-bulletin-chain running litep2p-0.11.0 logs:

2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::ipfs::bitswap: handle inbound message peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 DEBUG tokio-runtime-worker sub-libp2p::bitswap: handle bitswap request from PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") for [(Cid(bafk2bzaceddrbd2g6lsyfkoib3wyp25tvdpm624kim6qnb7yxxkiu4i3p2yna), Have)]
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker sub-libp2p::bitswap: found cid Cid(bafk2bzaceddrbd2g6lsyfkoib3wyp25tvdpm624kim6qnb7yxxkiu4i3p2yna), hash 0xc7108f46f2e582a9c80eed87ebb3a8decf6b8a433d0687f8bdd48a711b7eb0d0
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::transport-service: open substream peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") protocol=/ipfs/bitswap/1.2.0 substream_id=SubstreamId(857) connection_id=ConnectionId(3)
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::transport-service: substream activity peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") connection_id=ConnectionId(3) self.keep_alive_timeout=3600s last_activity=2 pending_keep_alive_timeouts=4
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::websocket::connection: open substream protocol=Static("/ipfs/bitswap/1.2.0") substream_id=SubstreamId(857)
2025-10-22 16:57:54.410 DEBUG tokio-runtime-worker litep2p::websocket::connection: open substream protocol=Static("/ipfs/bitswap/1.2.0") substream_id=SubstreamId(857)
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: read data from socket nread=30 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: current frame size = 28 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: reset read buffer ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::websocket::connection: substream opened substream_id=SubstreamId(857)
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::websocket::connection: negotiating protocols protocols=["/ipfs/bitswap/1.2.0"]
2025-10-22 16:57:54.410 DEBUG tokio-runtime-worker litep2p::multistream-select: Dialer: Proposed protocol: /ipfs/bitswap/1.2.0
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: read data from socket nread=30 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: current frame size = 28 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.410 TRACE tokio-runtime-worker litep2p::crypto::noise: reset read buffer ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: read data from socket nread=50 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: current frame size = 48 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: reset read buffer ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: read data from socket nread=51 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: current frame size = 49 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: reset read buffer ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::multistream-select: Received message: Header(V1)
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::multistream-select: Received message: Protocol(Protocol(b"/ipfs/bitswap/1.2.0"))
2025-10-22 16:57:54.411 DEBUG tokio-runtime-worker litep2p::multistream-select: Dialer: Received confirmation for protocol: /ipfs/bitswap/1.2.0
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::websocket::connection: protocol negotiated protocol="/ipfs/bitswap/1.2.0"
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::substream: create new substream for websocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") codec=UnsignedVarint(Some(2097152))
2025-10-22 16:57:54.411 DEBUG tokio-runtime-worker litep2p::protocol-set: substream opened protocol=/ipfs/bitswap/1.2.0 peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") direction=Outbound(SubstreamId(857))
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::transport-service: substream activity peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") connection_id=ConnectionId(3) self.keep_alive_timeout=3600s last_activity=2 pending_keep_alive_timeouts=4
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::substream: send framed peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") codec=UnsignedVarint(Some(2097152)) frame_len=44
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: read data from socket nread=30 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: current frame size = 28 ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.411 TRACE tokio-runtime-worker litep2p::crypto::noise: reset read buffer ty=WebSocket peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi")
2025-10-22 16:57:54.437 DEBUG tokio-runtime-worker litep2p::websocket::connection: connection closed with error peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") error=Decode(Io(Kind(UnexpectedEof)))
2025-10-22 16:57:54.437 TRACE tokio-runtime-worker litep2p::transport-service: primary connection closed peer=PeerId("12D3KooWPCSZa13bepfX22cgkMaHMwDpyoQRQaLmZkn5qDN6vwKi") connection_id=ConnectionId(3)

This might be because prost does not serialize message fields when they hold only default values (the field is then absent from the wire, and kubo's `newMessageFromProto` dereferences the missing wantlist), but this does not explain why the issue was observed only once.
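The wire-level difference behind this hypothesis is small but observable. In proto3 encoding, an embedded message field that is explicitly present but empty still produces two bytes (tag + zero length), while an omitted field produces nothing, so the receiver cannot distinguish "empty wantlist" from "no wantlist". A stdlib-only sketch of the encoding (field number 1, wire type 2, as in the Bitswap `wantlist` field; the helper name is illustrative):

```rust
/// Encode proto3 field 1 as a length-delimited embedded message.
/// Tag byte 0x0a = (field_number 1 << 3) | wire_type 2.
/// Assumes payload length < 128 so the varint length fits one byte.
fn encode_field_1(payload: &[u8]) -> Vec<u8> {
    let mut out = vec![0x0a, payload.len() as u8];
    out.extend_from_slice(payload);
    out
}

fn main() {
    // Field present but empty: 2 bytes on the wire -> receiver sees a wantlist.
    let present_but_empty = encode_field_1(&[]);
    assert_eq!(present_but_empty, vec![0x0a, 0x00]);

    // Field omitted (what prost emits for a default/None message field):
    // zero bytes -> a receiver without a nil check can crash.
    let omitted: Vec<u8> = Vec::new();
    assert!(omitted.is_empty());

    println!("present-but-empty: {present_but_empty:?}, omitted: {omitted:?}");
}
```

If this hypothesis holds, a fix would need to guarantee the two-byte encoding is emitted even for an all-default wantlist, or the receiving side must tolerate the field's absence.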

Labels: bug