Failed to load the torrent: metadata too large #1637

Closed
opened 2026-02-21 15:46:11 -05:00 by deekerman · 13 comments

Originally created by @qqqqq8 on GitHub (Aug 21, 2014).

When I try to open this torrent file, I get this error: "Failed to load the torrent: metadata too large"
http://www.nyaa.se/?page=view&tid=473067

@SamyCookie commented on GitHub (Aug 21, 2014):

Seems to be the same issue as #1041 and #1459. It may be fixed when #894 will be implemented.

@sledgehammer999 commented on GitHub (Oct 5, 2014):

Reopening because it isn't fixed. And marking for v3.1.11

@sledgehammer999 commented on GitHub (Oct 22, 2014):

I am transferring this for v3.2.0.
The solution is quite intrusive code-wise and I am afraid that I might introduce a bug. If v3.1.11 weren't the last of the v3.1.x series, I would make the change.

@sorokin commented on GitHub (Oct 22, 2014):

@sledgehammer999 Why is it intrusive?

```
// torrent_info.cpp:485
int load_file(std::string const& filename, std::vector<char>& v, error_code& ec, int limit = 8000000)
{
    ec.clear();
    file f;
    if (!f.open(filename, file::read_only, ec)) return -1;
    size_type s = f.get_size(ec);
    if (ec) return -1;
    if (s > limit)
    {
        ec = error_code(errors::metadata_too_large, get_libtorrent_category());
        return -2;
    }
```

Doesn't replacing 8000000 with something greater work?

@sledgehammer999 commented on GitHub (Oct 22, 2014):

That particular function was never supposed to be public API; that's why it was removed in libtorrent 1.0.0 (I have asked arvid).
So now we either use the constructor that takes a buffer or the constructor that takes a lazy_entry.
Either way, IMO we have to create a wrapper function, since we construct torrent_info objects in various places.
It's true that the solution might seem simple enough, but I don't want to risk it this late in the series for ONE instance of a big torrent.

@sorokin commented on GitHub (Oct 22, 2014):

I mean, could you just change this in the code of libtorrent for the Windows builds?

Then we could ask arvid to increase the limit, or to expose this limit parameter in the public API for us, or read the file manually and initialize torrent_info from a lazy_entry.
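
A minimal sketch of the "read the file manually" half of that suggestion, using only the standard library (so no size cap applies). The commented-out libtorrent calls at the end are from the 1.0-era API as I recall it (lazy_bdecode, the lazy_entry-taking torrent_info constructor) and should be treated as assumptions, not verified signatures:

```
#include <cassert>
#include <cstdio>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

// Read an entire .torrent file into memory with no size limit, unlike
// libtorrent's internal load_file(), which rejects files larger than
// its `limit` parameter (8000000 bytes by default).
std::vector<char> load_file_unbounded(std::string const& filename, bool& ok)
{
    std::ifstream in(filename.c_str(), std::ios::binary);
    std::vector<char> buf;
    ok = false;
    if (!in) return buf;
    // istreambuf_iterator copies every byte verbatim (no whitespace skipping).
    buf.assign(std::istreambuf_iterator<char>(in),
               std::istreambuf_iterator<char>());
    ok = !in.bad();
    return buf;
}

// With the buffer in hand, qBittorrent could then build the metadata
// itself, e.g. (hypothetical, libtorrent 1.0-era API):
//   lazy_entry e; error_code ec;
//   lazy_bdecode(&buf[0], &buf[0] + buf.size(), e, ec);
//   torrent_info ti(e, ec);
```

This sidesteps the limit entirely in qBittorrent's own code, without patching libtorrent.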

@sledgehammer999 commented on GitHub (Oct 22, 2014):

> I mean could you just change this in code of libtorrent for windows builds?

This isn't a true solution. What would happen with the rest of the OSes?
I don't believe in poking other libs, unless the patch is to be merged upstream.

@sorokin commented on GitHub (Oct 22, 2014):

> I don't believe in poking other libs, unless the patch is to be merged upstream.

Perhaps you are right.

I thought we could just hack it to make the user happy. Then, for 3.2.0, we could develop a proper workaround or change the interface of libtorrent.

I've read the conversation in libtorrent-discuss; I don't understand why arvid doesn't simply want to bump this limit.

@sledgehammer999 commented on GitHub (Oct 22, 2014):

He said something about API and ABI compatibility. I am not sure when ABI breakage occurs, but I assume that this is one of the cases.

@Belove0 commented on GitHub (Oct 22, 2014):

A paradoxical workaround for that particular content, for those with the storage space, is probably the original lossless 1.25 TiB version of the torrent, with an "only" 7.01 MiB torrent file. Untested.

@alexeightsix commented on GitHub (Mar 31, 2015):

was this ever fixed??

@sledgehammer999 commented on GitHub (Mar 31, 2015):

Yes, but you'll have to try the v3.2.0beta builds on the forum.
I don't think I backported this to the v3.1.x series.

@illeatmyhat commented on GitHub (May 1, 2016):

I've tried this on v3.3.4 and I get the same error.
