mirror of
https://github.com/qbittorrent/qBittorrent.git
synced 2026-03-02 22:57:32 -05:00
Failed to load the torrent: metadata too large #1637
Originally created by @qqqqq8 on GitHub (Aug 21, 2014).
When I try to open this torrent file, I get this error: "Failed to load the torrent: metadata too large".
http://www.nyaa.se/?page=view&tid=473067
@SamyCookie commented on GitHub (Aug 21, 2014):
Seems to be the same issue as #1041 and #1459. It may be fixed when #894 is implemented.
@sledgehammer999 commented on GitHub (Oct 5, 2014):
Reopening because it isn't fixed. And marking for v3.1.11
@sledgehammer999 commented on GitHub (Oct 22, 2014):
I am transferring this to v3.2.0.
The solution is quite intrusive code-wise, and I am afraid I might introduce a bug. If v3.1.11 weren't the last of the v3.1.x releases, I would make the change.
@sorokin commented on GitHub (Oct 22, 2014):
@sledgehammer999 Why is it intrusive?
Doesn't replacing 8000000 with something greater work?
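For context, the 8000000 being discussed is a byte cap of roughly 8 MB on the .torrent metadata. A guard of this shape (hypothetical names; the real check lives inside libtorrent and its exact form may differ) is what "replacing 8000000 with something greater" would patch:

```cpp
#include <cstddef>

// Illustrative sketch only, not libtorrent's actual code. 8000000 is the
// figure quoted in this thread; bumping it in one local build of the
// library would change the behavior only for that build.
constexpr std::size_t kMaxMetadataSize = 8000000;

bool metadata_within_limit(std::size_t metadata_size)
{
    return metadata_size <= kMaxMetadataSize;
}
```

The torrent in question fails because its metadata exceeds this cap, even though it is an otherwise valid file.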
@sledgehammer999 commented on GitHub (Oct 22, 2014):
That particular function was never meant to be public API, which is why it was removed in libtorrent 1.0.0 (I asked arvid).
So now we either use the constructor that takes a buffer or the constructor that takes a lazy_entry.
Either way, IMO we have to create a wrapper function, since we construct torrent_info objects in various places.
It's true that the solution might seem simple enough, but I don't want to risk it this late in the series for ONE instance of a big torrent.
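The wrapper idea above can be sketched as follows. This is a hypothetical, stdlib-only sketch of the file-loading half of such a wrapper (not qBittorrent's actual code): read the .torrent file into memory with a caller-chosen cap instead of relying on a limit hard-coded inside the library. The resulting buffer would then be handed to libtorrent's buffer-taking torrent_info constructor, which is omitted here.

```cpp
#include <cstddef>
#include <fstream>
#include <optional>
#include <string>
#include <vector>

// Read a .torrent file into a buffer, enforcing a caller-supplied size
// cap. Returns std::nullopt on open failure, oversized metadata, or a
// short read. Names and default are illustrative assumptions.
std::optional<std::vector<char>> load_torrent_file(const std::string& path,
                                                   std::size_t max_size)
{
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    if (!in)
        return std::nullopt;                 // file could not be opened

    const auto size = static_cast<std::size_t>(in.tellg());
    if (size > max_size)
        return std::nullopt;                 // refuse oversized metadata

    std::vector<char> buf(size);
    in.seekg(0);
    if (!in.read(buf.data(), static_cast<std::streamsize>(size)))
        return std::nullopt;                 // short read / I/O error
    return buf;
}
```

Centralizing the load this way means every place that constructs a torrent_info goes through one function, so the limit can be raised (or made configurable) in a single spot rather than patched per call site.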
@sorokin commented on GitHub (Oct 22, 2014):
I mean, could you just change this in the libtorrent code for the Windows builds?
Then we could ask arvid to increase the limit or to expose it as a parameter in the public API for us, or we could read the file manually and initialize torrent_info from a lazy_entry.
@sledgehammer999 commented on GitHub (Oct 22, 2014):
This isn't a true solution. What would happen on the rest of the OSes?
I don't believe in patching other libs unless the patch is going to be merged upstream.
@sorokin commented on GitHub (Oct 22, 2014):
Perhaps you are right.
I thought we could just hack it to make the user happy, and then for 3.2.0 we could develop a proper workaround or change libtorrent's interface.
I've read the conversation on libtorrent-discuss, and I don't understand why arvid doesn't simply want to bump this limit.
@sledgehammer999 commented on GitHub (Oct 22, 2014):
He said something about API and ABI compatibility. I am not sure when ABI breakage occurs, but I assume this is one of those cases.
@Belove0 commented on GitHub (Oct 22, 2014):
A paradoxical workaround for that particular content, for those with the storage space, is probably the original lossless version of the torrent: 1.25 TiB of content, but an "only" 7.01 MiB .torrent file. Untested.
@alexeightsix commented on GitHub (Mar 31, 2015):
Was this ever fixed?
@sledgehammer999 commented on GitHub (Mar 31, 2015):
Yes; try the v3.2.0 beta builds on the forum.
I don't think I backported this to the v3.1.x series.
@illeatmyhat commented on GitHub (May 1, 2016):
I've tried this on v3.3.4 and I get the same error.