When Megaupload (and some other sharehosters) died, quite a lot of interesting things just disappeared from the net.
I'm talking about things like small tools that were shared on e.g. xda-developers before Github came, about fan-mods for games etc... The 'big' ones continued living, but if you now e.g. search for a special kernel/ROM for your G1/ADP1 you are mostly out of luck.
It's sad that there's basically no way for an organisation like archive.org to archive things from sharehosters given the unclear (or quite clearly black/gray) law situation and also the missing cooperation with the sharehosters themselves.
>I'm talking about things like small tools that were shared on e.g. xda-developers before Github came, about fan-mods for games etc... The 'big' ones continued living, but if you now e.g. search for a special kernel/ROM for your G1/ADP1 you are mostly out of luck.
This is something that has always sketched me out about the Android rooting and modding scene. "Download this suspect binary from Rapidshare and run it as root" seemed to be a cornerstone of it.
It's the same with things like game trainers, save editors, etc. Sign up and give your e-mail address to this forum so that you can see the download link for this file. Which is an outdated version. The new version is in another thread on a separate forum, but you have to register to see links. The link is to a Rapidshare download page, which rest assured is totally legit.
I honestly don't understand why more people don't create Github accounts and use that to distribute, or at least use their ISP's free web space. Most of these tools have names, are well-known, and are the top hit on Google, but none of them have an actual website that you can go to to see if they've released new versions, something for other games, etc.
Trainers, save editors, and translations are a common entry point into programming: a close-knit group of people who love video games and want to modify/hack/translate their favourite games. At some point programming comes to the table, and people start sharing knowledge about it in their own way of doing things. A popular project gets hosted somewhere, and the following projects get hosted there too, simply because people learn by looking at the popular one. Rapidshare, Megaupload, all of these are tools people know before getting into programming, so they just use them. Github is something that comes later, if things get serious.
Dwarf Fortress stuff is like that; Minecraft is even worse, you get adf.ly in the middle :D
dffd exists for things that aren't github(etc)-appropriate such as tools. No real reason to use anything else unless you're trying to monetise it with adf.ly etc (which might also count as a reason not to download).
To me it seems very simple: level of effort. Uploading your hack/mod/whatever to rapidshare takes about one minute, or less. On the other hand, if you want to learn about git and github, you have to spend plenty of time on that.
You /can/, but it's not a use case that github pushes or that people who don't know github would think of. With rapidshare et al you go to the site and it has a big obvious "upload and share a file" button right on the front page.
So? It's possible, and people can make use of it. They may not know, but this can be remedied by communicating. It's also quite a lot better than using some one-off file host such as rapidshare. See Ixiaus' post for more information.
Defaults are important, and what's optimized for one use case isn't optimized for another. An opinionated site that just provides the simplest possible interface for uploading and downloading files is a valuable thing.
(Plus I suspect that if a lot of people started hosting multi-megabyte binaries on github, their policies would change pretty quickly)
There are already many, many multi-megabyte binaries hosted on Github. That is what the releases feature is for.
But yes, the simplest site is best for the simplest users. However, we're talking about people who put a lot of time into creating their mod/whatever here. They can spend a minute or two more to figure out distribution.
I have no idea why you're being downvoted but you're absolutely right. You can create a new repo, click on "releases", click on "draft new release". Type in the description in markdown (easier than HTML) and upload a binary attachment, hit publish.
Zero knowledge of git was necessary. Oh and also if you want to edit that README.md? You can do it from inside Github too, still zero need to know git.
Github is very handy for someone creating a portfolio and showing off a history of their programming skills. But you could get the pseudo-anonymity with a separate Github account. It's just not worth the effort; the audience for these hacks doesn't demand it.
Even better, anybody wanting to distribute something using free services should upload to multiple services (github included). Redundancy is a good thing.
What makes random binaries uploaded to GitHub releases more trustworthy? You still only have a random online identity linked to them. All trust still comes from the context of the announcement posts.
>This is something that has always sketched me out about the Android rooting and modding scene. "Download this suspect binary from Rapidshare and run it as root" seemed to be a cornerstone of it.
The warez/cracks scene was essentially the same thing, and yet if you knew where you were getting things from, it was quite safe. The antipiracy groups have since been spreading plenty of FUD (and some possibly attaching malware to releases, I don't know) and working with the AV/security industry to make you believe otherwise, however.
Just as a warez/cracks group would be called out for it and very publicly shamed if they put malware in their releases, the same would happen in the Android scene. It's true that there are many rather clueless users (known as "leechers" in the vernacular), but there are also many knowledgeable ones and all it takes is one to give sufficient evidence of malice to trigger the "immune reaction".
But it's even worse than that. People run random operating systems on devices they carry 24/7. Devices with microphones, multiple cameras, access to personal and work email, text messages, passwords, your location.
And there are so many places for things to go wrong. Any one of the following could be malicious, incompetent, or compromised:
* The ROM's maintainer. There are many groups here, for example many ROMs are based on ParanoidAndroid, which is based on Cyanogenmod, which is based on AOSP.
* The device maintainer. Typically each brand/model of device has its own volunteers to maintain any proprietary blobs or special upgrade process.
* The hackers who provide special binaries that root each device, unlock the bootloader, etc.
* The added packages you typically get separately from the ROM, for example Google Apps.
* The build machine, typically just some random box donated semi-anonymously by someone.
* The web hosting (without TLS, of course) provided by some other random person.
I love Android. I compile and run my own ROM. But the current scene scares the shit out of me.
It's not clear to me how this differs qualitatively from the current situation with equipment manufacturers all doing their own customizations to devices. Quantitatively there's a difference - a smaller pool of devs/maintainers to potentially subvert and a much smaller pool of potential users vs. a much larger manufacturer dev team and a much larger potential pool of users.
How much would it cost to buy off, for example, the entire radio hardware/firmware team at a manufacturer in your own country (meaning pretty much either China or South Korea), and on a governmental scale how reasonable or unreasonable is that number?
I was thinking of something more like Glenn Greenwald's allegation back in 2013 that the NSA would intercept international shipments of Cisco (and other) equipment, implant backdoors, then send them on their way with factory seals.
Back when I was more involved with security research, and WinXP SP1 was new and shiny, I remember that there were several trojaned XP .isos floating around.
The fact that antipiracy and AV groups have an interest in getting you scared does not mean that there's no reason to be scared of running random binaries you found on the net.
And most people do not feed directly off the warez/cracks hubs - they feed off whatever they can find. Which means a lot of opportunity for bad actors.
>The antipiracy groups have since been spreading plenty of FUD (and some possibly attaching malware to releases, I don't know) and working with the AV/security industry to make you believe otherwise, however.
Isn't this racketeering?
Furthermore, AV programs which classify keygens, etc. in similar categories as keyloggers/ adware, etc. (such as Microsoft Security Essentials) also have a net effect of increasing malware prevalence by training users to ignore AV warnings.
Why are you using the past tense, as if software piracy didn't exist anymore? People in my country pirate everything (including Android apps and games).
I'm using past tense to contrast it with today where the amount of spam/(true) malware-filled content around, especially for popular apps, is much higher than it was before.
This is generally taken care of using checksums, assuming you trust the source. Of course, trusting the source is the hard part. CyanogenMod, for example, has a chain of trust for their releases; though they host their own binaries.
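For the curious, checksum verification is only a few lines of Python with the stdlib's hashlib; a minimal sketch, where the file path and expected digest are of course placeholders for whatever the ROM maintainer publishes:

```python
import hashlib


def sha256_of(path, chunk_size=65536):
    """Stream a file through SHA-256 so a large ROM zip doesn't
    need to fit in memory all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify(path, expected_hex):
    """Compare the local file against the checksum published
    alongside the download (trusting that channel is the hard part)."""
    return sha256_of(path) == expected_hex.lower().strip()
```

Of course this only moves the trust problem: a checksum fetched from the same compromised page as the binary proves nothing, which is why a signed chain of trust like CyanogenMod's is the stronger option.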
It is too bad that these went down, but yech, sharing binaries or even entire OS images made of open source code... what if you want to change one configuration option or line of code while keeping whatever special modifications were made? What if you want to combine the changes in X fancy ROM with Y? (Never mind that the kernel is GPL.)
I'm not an Android user myself, but I will nevertheless suggest that optimally these things would be distributed as source on GitHub, with a deterministic build system guaranteed to be able to reproduce binaries in the future, and only secondarily as binaries.
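One way to check that a build really is deterministic is to hash the whole output tree in a fixed order and compare digests from two independent builders; a sketch of that idea (the helper name and layout are my own, not any real build system's API):

```python
import hashlib
import os


def tree_digest(root):
    """Hash every file in a tree in sorted traversal order, so two
    byte-identical build outputs always produce the same digest,
    regardless of filesystem ordering."""
    h = hashlib.sha256()
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()  # fix the traversal order in place
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            h.update(rel.encode())  # include the relative path itself
            with open(path, "rb") as f:
                h.update(f.read())
    return h.hexdigest()
```

If two people building from the same tag publish matching digests, anyone can trust that the binary corresponds to the source they can read - no need to trust any single build machine.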
I build them on AWS now, mainly since it takes me 5hrs to download the source locally compared to a few minutes on AWS. Does anybody know of a better cloud building service? I consistently fail to estimate instance costs.
I am, and it doesn't cost much at all; it's just that sometimes there are problems with builds you have to manually track down. I'd rather pay a flat monthly fee somewhere and get x amount of builds/bandwidth if possible. Build time is only around 14 mins, plus the few minutes to fetch the source with repo, plus a few seconds to run my script that yoinks the junk from Android and the kernel.
This is why I have stopped believing that "The Year of the Linux Desktop" would even be a good thing.
Android has "regular old joe sixpack" users by the millions, but has the Android community actually benefited from that? The existing "desktop linux" community has their act together far more than the Android community, despite (or because of?) not having those legions of unskilled users.
Benefits have come out of it. It's not about the number of users who have it, but about the money companies are backing the users with. Linux has gotten some great new features that benefit the ecosystem as a whole that came out of Google's work on ChromeOS and Android. dm-verity is a good example of this.
Hardware support specifically is often cited as one of the ways that "desktop linux" would benefit from joe-sixpack users. That is what has always been lacking from "desktop linux".
Hardware support in official ROMs from phone manufacturers is good of course, but what is freely available to the Android community is much worse. It turns out joe-sixpack doesn't really give any shits about hardware support being open-sourced.
It's nice that Google has given back some stuff, but that's peanuts compared with what the "year of the linux desktop" meme promises.
I bricked my Samsung GS3 (SPH crappy one) doing just that. 8 hours off-and-on of random binary / flashing and on the 9th hour, bricked. Horrible, horrible experience.
I wish the Internet Archive had an embargo feature, where you could push data in and while it wouldn't be served, it would be stored until a later date when copyright issues could be worked out.
It would work kind of like an escrow, right? Where the "release/publish clause" could be tied to copyright expiration or other legal events like you say.
The problem I see is, how do you filter/reject the amount of stuff you would have to store without even being public? AFAIK, the IA has issues storing so much of the already-public data, so storing "dark" (non-public/unpublished) stuff would potentially mean lots of cruft and garbage so to speak.
However it does sound very interesting. I hope we can some day achieve truly permanent data storage systems where we could just dump all of this and not worry much about it again.
Edit: Thinking about it a bit more, how feasible does the following sound:
Anyone interested in helping the IA could buy a sort of Drobo/NAS that is able to store only IA stuff (ala Freenet). Everything is encrypted of course, and the only way to access the files is when the escrow trigger fires: the private key is released at the IA and then every owner of an IA-box will have access to that particular part of the archive (as well as regular IA users through the web).
It's kind of like an HDD preloaded full of torrents, and then the differences or new additions can be streamed to your local IA-box as needed. You could even filter what kind of stuff you would like to help the IA archive. For example, I'm a big fan of movies, so I prioritize that category (up to a certain % so that no one category is forgotten).
Does anyone know if anything resembles this? I mean, I could very well leave a low-powered NAS to help the IA serve their content, store it for later use, etc. And I imagine (hope) that a lot of other people would too. It would be a way of donating electricity, space to a worthy cause.
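The escrow part of the idea is simple to illustrate: the boxes store only ciphertext, which is useless until the key is published. As a toy sketch only (a homemade SHA-256 counter-mode stream cipher for illustration; a real system would use AES-GCM or similar, and the key names here are made up):

```python
import hashlib


def keystream(key, length):
    """Derive a deterministic keystream from the escrow key via
    SHA-256 in counter mode. Illustration only, NOT production crypto."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor(data, key):
    """Symmetric: the same call encrypts and decrypts. An IA-box
    stores xor(blob, key); the blob stays unreadable until the
    escrow trigger fires and `key` is published."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```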
I certainly would hope so. However I'm not sure who is going to survive the longest: an organization focused on that specific goal, or a company that might change focus or go bankrupt altogether. Even though AWS is a pretty big part of Amazon (I'm guesstimating here, might be wrong), I would hope the IA would last for longer or have specific strategies to address different catastrophic scenarios that would overall make it more efficient/safe than AWS.
Well not pure profit because the servers cost money. The customer base is so large now that they have way more capacity than they would need for themselves.
The matching of Google's prices when they released was a good indicator of how much was profit though.
> Does anyone know if anything resembles this? I mean, I could very well leave a low-powered NAS to help the IA serve their content, store it for later use, etc. And I imagine (hope) that a lot of other people would too. It would be a way of donating electricity, space to a worthy cause.
The Internet Archive serves torrent files for every object they store, so in theory, anyone could have a NAS that would crawl those torrent files and then join the swarm/seed for all of those objects.
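Since the torrent for each item lives at a predictable path under its download directory, a crawler only needs item identifiers to get started; a sketch of the URL derivation (the identifier here is a placeholder):

```python
def ia_torrent_url(identifier):
    """Each public archive.org item exposes a torrent named
    <identifier>_archive.torrent under its download path."""
    return "https://archive.org/download/{0}/{0}_archive.torrent".format(identifier)
```

Feed those URLs to any BitTorrent client and the NAS joins the swarm for each item it chooses to mirror.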
That would be a hell of an open source project. The Internet Archive would then be a metadata repository and seeder of last resort. #shutUpAndTakeMyMoney
Isn't that already the case? I heard that they already download well-tagged music from private trackers like what.cd, which could be published if its copyright eventually expires. There's a page on archive.org with "what_cd" in its URL and no public items: https://archive.org/details/what_cd
Would be more interesting if copyright on anything created after Mickey Mouse would ever expire -- and unfortunately it doesn't look like it ever will.
The same is true within the styles of music that I like. It was compounded by the fact that other centralized file hosts, like Rapidshare and Mediafire, were forced to delete TONS of content on the same day that Megaupload folded.
There are many obscure demos, rehearsals, etc. that disappeared from the internet and most haven't reappeared since. I knew some bloggers who had uploaded probably in the range of 5000-10000 old metal demos, and these guys were careful to not post copyrighted material, but it seemed like if they got even a single strike against their account, everything was deleted.
I hope someone imaged those servers, otherwise a lot of that content might be lost forever.
I wonder if it's feasible to build a hosting site that strictly forbids (and actively removes) copyright infringing material, intended more for this kind of file hosting than for piracy.
Monetising would be a challenge, since rapidshare et al mainly made money by selling premium accounts, which are generally only attractive for piracy purposes.
Both require registration and are far more than just file hosts. The thing I like about rapidshare and other file hosts is that you can just drop a file on it and a minute later get a link. No thinking or effort required, no activation emails (or endless onboarding emails for that matter), no passwords to remember.
There's no such thing as "copyright infringing material"; it depends on who has uploaded it. If I download an awesome ROM from Megaupload using the link shared by its creator and upload it myself to Megaupload to get my own link, the very same file is both infringing and not infringing copyright.
This is what people who were appalled that megaupload would only remove one of the links to a deduplicated file, instead of removing the file itself, failed (or refused) to understand.
I think it was never a great idea to share ROMs through Megaupload, but it seemed to be the easiest option. I'm happy to see more ROM devs starting to use needrom.com. They have ROM hosting as well as a commenting/rating system; it's not perfect, but at least they have lots of Android ROMs available.
The second and third albums I bought in my life were from mp3.com. That site had a huge impact on where I am today (teaching music, playing in a band). The people who bought it had a responsibility to keep that music alive. It's a shame they didn't. Some of that music is lost forever.
Nothing guarantees that anything keeps serving files forever. BT at least distributes responsibility and allows anyone to easily keep the files hosted.
I've often wondered if Freenet, or a similar encrypted network file system would be a viable solution. This, of course, requires a significant number of computers to contribute full time.
AFAIK Freenet is basically a cache with no incentive system, so content will expire. There are some cryptocurrency-oriented storage projects but they all seem to be in way over their heads.
I think the fundamental economics for a cryptocurrency approach don't lean towards preserving obscure things that don't interest many people. Storage cost is always going to be proportional to the amount of data being stored, but income is always going to be proportional to the number of people who want the data.
In Freenet, content will expire unless it's accessed regularly. If you want to make sure your files stay around, just poll them every other week (which may or may not already be supported by the software); it's a simple "proof of interest" scheme:
As long as you care, the files are reasonably safe to be around. When you (and everyone else) stops caring, they'll slowly disappear in favour of more popular stuff.
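That keepalive polling is nothing more than periodically requesting each key; a sketch with a pluggable fetch function (Freenet's actual client API, e.g. an FCP wrapper, would go behind `fetch` - none of the names here come from the real software):

```python
def keep_alive(keys, fetch):
    """Request each key and return the ones still retrievable.
    `fetch` takes a key and returns True on a successful retrieval;
    requesting the data is what keeps it cached in the network."""
    alive = []
    for key in keys:
        try:
            if fetch(key):
                alive.append(key)
        except Exception:
            pass  # node unreachable or fetch failed this round
    return alive
```

Run it from cron every couple of weeks and anything still listed in `keys` keeps getting refreshed for as long as you care.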
The point is that a service like Rapidshare doesn't cost the user any money (they get ads instead). Running a torrent server is expensive and requires dedicated hardware too. You can't really seed from the webhost, after all.
The user can use seedboxes, they're far from being as difficult to use as dedicated hardware. They sometimes have some free tier that I expect would be more than enough for a team like xda-developers (think 5GB). It's true that they wouldn't be able to earn money through this, though.
You can seed from regular webhosts - as long as it's a regular http query to a regular http server, not some temporary http query hidden behind several layers of interstitial ads for the premium account.