Adventures with ffmpeg and color ranges (facebook.com)
177 points by doppp on May 29, 2019 | 91 comments


Not sure if this kind of meta-comment is appropriate, but I find it difficult to follow a link like this because of the lack of human readable information contained in the URL.

If the story was hosted on a personal blog I recognised (like Julia Evans to take a random example), then perhaps the domain would be enough for me to choose to read the article to see if the content is interesting.

If the link was to a broader domain that hosts many writers, I might use the parts after the domain to filter my interest: example.com/news-opinion vs example.com/science-blog gives me more information that might not be clear from the post title.

With a site like Facebook, which hosts content of wildly varying quality, it is harder for me to visit facebook.com/233555323 than facebook.com/ffmpeg-interest-group/post6665443. For whatever reason, the URL remains something I use to filter my internet consumption.

Perhaps it has something to do with the amount of scams or other general reasons not to trust non-human readable URLs, or perhaps it is more due to my distrust of article titles (sensationalism) and seeking a second source to verify my interest. Perhaps I simply bias against Facebook - I don't know.

Regardless, I visited the story after thinking about this, and was very surprised to see it was an article by John Carmack - an author I certainly respect. It is a shame for me that both the URL and the title don't reflect that.


Another site that comes to mind is the Financial Times, which uses UUIDs for article URLs, with no other indication of content.

https://www.ft.com/

https://www.ft.com/content/13be9132-8149-11e9-b592-5fe435b57...


Oh wow, you really have to hope these are not the canonical links. If so, they are doing themselves an incredible disservice and the fact that no one has brought it to management's attention speaks gravely to the workplace culture there.


Not only does the full URL not show in my browser (Safari), the full URL doesn't even show on this forum (for me it is cut off after "content"). When you receive a URL via chat or on social networks, it is in the form of a preview and might not even show the domain name. The user also isn't typing these URLs in, with or without human readable text in the path. Essentially all of the arguments for making URLs more brittle (which is what you are doing by adding human readable text) are obsolete in a world where URLs have been relegated back to what they were always supposed to be: permanent identifiers to a semantically stable unit of content.


> Essentially all of the arguments for making URLs more brittle (which is what you are doing by adding human readable text)

Not if done right, like e.g. stack overflow, see https://stackoverflow.com/questions/517355/string-formatting...

The human readable text is actually meaningless, and anything after the id will redirect there, e.g. https://stackoverflow.com/questions/517355/this-could-be-any...


This technique (which I almost commented on in my original comment, but didn't) is brittle in a different way: attempts to search for the URL have no well-known canonical representation. It also totally undermines the argument being made (which is why I didn't comment on it) that someone can use the slug of a URL to decide whether or not they want to click on the content: you can change the URL to say anything you want. At that point there is no value in bothering with it: it is just extra text in the URL for no concrete benefit.

meta.stackexchange.com/questions/148454/why-does-stack-overflow-use-title-slugs/


> very surprised to see it was an article by John Carmack

John Carmack works for Oculus/Facebook, so that's probably part of the reason he posts on Facebook, but you might mean you were surprised because the URL gave no indication.


The title actually used to say Carmack, so I guess a HN moderator removed it...


Stackoverflow solves this issue like so: stackoverflow.com/questions/<question_id>/<readable_slug>, ex: https://stackoverflow.com/questions/56340579/i-could-not-set... Interestingly, you can change the slug and still end up on the same page, which just shows how simple a system it is. I'm sure there are trade offs but I always thought it was a clever idea (for its simplicity).


Isn't a quick glance at the article enough to decide whether or not to read it? It's not like you are forced to read the article after all.


I think the point OP is making is that they won't even click the link in the first place.


Funnily enough, I ended up giving myself a crash course in this very topic when encoding frag movies for Quake 3/Quake Live. It was/is common practice to use 'brightskins' on your enemy models in competitive play to make them easier to see, typically a colour like '#00FF00'. I noticed that when uploading to online media hosting sites like YouTube, these bright colours would become noticeably washed out and degraded. My investigation of why this happened led me all the way down this rabbit hole. This was the better part of ten years ago now, so I have no idea if it still occurs, but I ended up just rendering the videos with a brightskin colour that appeared less degraded after conversion... Cycling through 'rerender/reupload/check' again and again to find a replacement colour sure was a major pain!


For all of our sanity, please don't start producing 8-bit H.264 videos with full 0-255 range.

If you have enough control over the decoder and renderer to make that hack work, you should just be producing 10-bit HEVC/AV1 video – which does not suffer from these 8-bit banding issues – with a modern color space (Rec 2020, PQ, etc.).

And if you don't have choice or control over the decoder and renderer, then it's a good sign that this would be a fragile hack.


My primary concern is for immersive VR media on our Go and Quest platforms, which use Snapdragon 821 and 835 SoC, respectively. The 821 doesn’t support 10 bit on anything, and the 835 only supports it on h265. Software decoding a 4k60 video on mobile isn’t an option, and you can see every bit transition with dark adapted eyes in a vr headset, so getting full 8 bit range is pretty valuable.


You forgot to mention which colorspace you're using. It matters.

From the Rec. 2020 Wikipedia article: "Since a larger color space increases the difference between colors, an increase of 1 bit per sample is needed for Rec. 2020 to equal or exceed the color precision of Rec. 709."

The "correct" solution is to use a modern colorspace (rec 2020) with 10 bits encoding (or 12 bits, if you can). UHD standard mandates at least 10 bits.

On the '821 you could gain 1 bit by degrading to an older color space - 709.

Full 0-255 range should never be used - it is invalid according to the standard. None of the hardware codecs are validated for 0-255, so you're asking for trouble here. Expect ghosting and weird motion artifacts accumulating from one I-frame to the next.


Are you sure about that? Logitech sells H264-encoding cameras which do yuvj420p ( which is full range as I understand it ).

YT and other platforms have no problem ingesting these videos. Also the RPi Broadcom chipset does not seem to care.


So to clarify, you don't agree that anyone should use -pix_fmt yuvj420p when encoding video to share with others?

Let's say I have a video with full RGB range 0-255 video. Why would 10-bit be a better option than 8-bit?


Logitech C920 family cameras do exactly that. H264 full range.


> This is usually visible as ugly banding or blocking in dark scenes.

I was just complaining about this a couple weeks ago with some friends after the “Dark” episode of GoT. I was saying the codecs don’t seem to have caught up or they are just being fed at too low a bit rate to really excel at dark scenes.

It’s hard to imagine this is just a scaling issue with the color ranges, and if so, how could it possibly not have been fixed?!

I have a high end OLED screen in a very dark room, and I was very disappointed by the picture quality. Enough to wish I hadn’t watched it through HBO Go and that I had a better source.

I would love to see some before/after scenes showing the effect of the patch.


It's not; full-range YCbCr doesn't exist anywhere in the "pro" or broadcast pipelines that would have been used for GoT.

I think he meant to say that the limited dynamic range of 8-bit video manifests most obviously in dark scenes, since that's where standard gamma curves are the most imprecise relative to our vision (which was fine back when TVs couldn't reproduce that range anyway.) The 14% from full-range doesn't help that much; really you need the 4x finer precision you get from 10bit.
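The code counts behind those numbers can be sketched with back-of-envelope arithmetic (my own illustration, not from the comment; assumptions: 8-bit limited-range luma spans 16..235, full range spans 0..255, and 10-bit limited range spans 64..940, per the usual Rec. 709 conventions):

```python
# Counting usable luma codes under the common range conventions.
full_8bit = 256                # codes 0..255
limited_8bit = 235 - 16 + 1    # 220 codes
limited_10bit = 940 - 64 + 1   # 877 codes

loss = 1 - limited_8bit / full_8bit
print(f"limited 8-bit gives up {loss:.1%} of the codes")              # ~14.1%
print(f"10-bit limited has {limited_10bit / limited_8bit:.1f}x the levels of 8-bit limited")
```

This matches the figures in the thread: limited range costs roughly 14% of the 8-bit codes, while 10-bit limited range offers about 4x as many levels.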


A) Using a different pixel format is a violation of the spec (H.264's Baseline, Main, and High profiles all define yuv420p). Don't expect your decoder to present these properly.

B) The issues you're seeing with background banding are _not_ primarily colorspace issues. While having a larger colorspace helps, they are largely related to the way quantization is done for the macroblocks in h.264.

The original h.264 spec (covering the Baseline and Main profiles) only had an exponential quantization matrix.

During quantization we are "rounding" coefficients to the nearest integer. Imagine each macroblock had only _one_ frequency coefficient. A dark scene would have a low coefficient, say 1.4; rounding that to 1 is a 28.6% error. A brighter scene at 9.4 has a rounding error of only 4.3%.

Custom quantization matrices and non-linear quantization matrices (which are available in newer codecs) help deal with this.

Advanced h264 encoders have variance adaptive quantization technologies to help choose what level of quantization to apply to a block.

In short: h264 has technological limitations that are not caused by colorspace and the encoder doing this encode doesn't have strong AQ.
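The single-coefficient toy example above is just relative rounding error; a minimal sketch of that arithmetic (mine, following the comment's simplified one-coefficient assumption):

```python
# Relative error from rounding a quantized frequency coefficient to the
# nearest integer: much larger for the small coefficients typical of
# dark, low-contrast blocks than for bright, high-energy ones.
def relative_rounding_error(coeff: float) -> float:
    return abs(coeff - round(coeff)) / coeff

dark = relative_rounding_error(1.4)    # ~28.6%
bright = relative_rounding_error(9.4)  # ~4.3%
print(f"dark block: {dark:.1%}, bright block: {bright:.1%}")
```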


The Amazon 720p version had noticeable banding several times during that episode that was distracting even the day after, so it's probably not a bandwidth issue. The Dothraki charge into the dark might have been a cool idea on paper, but what a disappointment in execution. I think the director and cinematographer are just ignorant of how stuff gets delivered to consumers.

https://news.avclub.com/was-game-of-thrones-the-long-night-t...


From a tweet linked from that article:

https://pbs.twimg.com/media/D5SmDG7WwAAMhdf.jpg:large

I think that particular screenshot makes it look even worse than it was, but it was definitely sub-par. I assume that a hardcopy 4K Bluray would not exhibit the same effects?


>I have a high end OLED screen in a very dark room, and I was very disappointed by the picture quality.

Unfortunately for you, high end OLED in a very dark room is literally the most likely conditions under which you'd be annoyed by these problems... :-)

HDR is the way to go.


It’s Dolby Vision from the ATV to the set, it’s the source encoding I assume that just couldn’t live up to material.

Perhaps bitrate compromises have to be made when you’re streaming to so many people at once, but it’s not usually so glaring.


Oh, that's surprising. If it is truly Dolby Vision then the dynamic metadata should cope very well with darker scenes, so it must be either the source material or the compression for streaming.

It will be interesting to see if it looks better when a higher-quality Bluray version comes out.


Interesting find.

I have encoded video for over a decade, and I don't think a two month period has gone by where I haven't learned something that convinces me to completely change how or what flags I'm using.

This is way beyond that but still.


At the risk of telling you something you know, there are a lot more things to trip on with proper color space conversion than just going from scaled to full range color in 601. It's something of a rat's nest to cover all the cases.


After ProRes became a thing, there was a lot of confusion once the metadata saved within the MOV started to include the HD vs SD color primaries. Some programs were able to correctly read this data; most did not, and most programs, including Final Cut, did not write the data. The iTunes team started rejecting files exported from FCP because they claimed they were not made correctly and were not valid files. The iTunes team were just reading the spec and built their automation based on that, not on what the real-world output from FCP was doing.


I still find it kinda odd that this is still the best tool out there for this sort of work. Archaic parameters and options and exceptionally hard to read documentation.

Maybe I'm just dumb and not deep enough in the domain to get a decent handle on it, but every time I use ffmpeg I feel like a Norse shaman trying, through trial and error, to locate the correct orientation of runes for the blessing.


there are two problems here. the main problem is that nobody reads the manual, which actually starts out half-decent. the description is very clear, concise, and even contains the oft-whinged-about examples. https://ffmpeg.org/ffmpeg.html

the second problem is that ffmpeg has a lot of features, and nobody wants to go through and fully document hundreds of formats, codecs, and filters.


Part of the problem is that the documentation is so vast (the page you link is only one of many, many pages; the documentation has been split out into multiple pages for a while now) that as a user, it's fairly insane to try and consume and comprehend it all, especially if all you want to do is something quick.


The third problem is that video is _complicated_. What does nal-hrd=cbr do? Well, you need to understand the HRD model (which is about 5 pages of dense spec).

Ffmpeg does a great job of guessing _what_ you want. But as soon as you're outside of those guesses you need to know _a lot_ to get things working.


I'll be adding entries for all the missing demuxers/decoders/encoders/muxers eventually. At present, only a fraction have been documented.


I wonder how Avisynth is doing now. In the early 2000s it was my tool of choice for video conversions.

It used scripts, and it required external tools to do the encoding but despite the complexity, I found it less arcane than ffmpeg command lines for similar operations.


I've used both. AviSynth 2.6 is really long in the tooth (honestly it's kind of a disaster) and 3.0 never happened. AviSynth+ seems interesting, but I don't know anyone using it outside of piracy communities.

Once I got better at understanding FFmpeg's filters there really was no looking back.


Not sure how true it is, but I read somewhere that the reason it's so good at what it does is because it's used at scale by certain large websites (youtube being specifically mentioned).


You've got cause and effect backwards there (mostly; it goes both ways).

Disclosure: FFmpeg contributor and Vimeo employee.


I don't know of any other tool that does what it does. I used it myself when handling video and needing to re-encode or fix up output captured from elsewhere. It's a Swiss Army knife of video operations.


Handbrake is a popular GUI on top of ffmpeg.



http://archive.fo/K5Fpv

> .IS domain will likely stop working soon[0]

[0] https://blog.archive.today/


Thank you. I really didn't want to go on Facebook.


TIL: EE, a mobile provider, blocks archive.is as unsafe content for people under 18. https://twitter.com/nvartolomei/status/1133644305812381696


That sounds correct to me, since it could contain NSFW content, and by default, mobile providers in England have content controls switched on.


The ffmpeg documentation for YouTube videos[1] suggests using `-pix_fmt yuv420p` with H.264 and there are warnings against using other pixel formats (such as yuv444p) for compatibility reasons. Should the documentation instead be suggesting `-pix_fmt yuvj420p`? What are the risks associated with choosing that output format?

[1]: https://trac.ffmpeg.org/wiki/Encode/YouTube


No, limited range is the most common and expected range for YUV videos. Many players don't look at or respect the container metadata, such as nclx in MP4 or the bitstream VUI flags, and assume limited range.


This is correct. In fact, the following statement from the post is nonsense:

> For arcane reasons related to TV broadcast limitations, many video formats restrict the YUV color components to be in the 16..235 or 16..240 range instead of the full 0..255 range. Losing 14% of the already-barely-enough 8 bit dynamic range is bad enough, but it also often results in the black range starting at a quite visible grey value because most players don’t rescale the range. This is usually visible as ugly banding or blocking in dark scenes.

Of course players "rescale" the range. (It's not exactly rescaling; it's just that a value of 235 IS reference white in Rec. 709.) In fact, the thought of them not doing this is kind of crazy, as Rec. 709 (used on Blurays and elsewhere) as well as Rec. 601 explicitly use limited range YUV. Players that don't handle limited range correctly would be out of spec and couldn't play a single Bluray disc correctly. So for the overwhelming majority of use cases, you want regular old yuv420p.

The author is picking up on something that actually happens, however. There are Blurays out there that have incorrectly converted YUV values. This happens when for example something is in limited range YUV, and then that data is passed to a converter that treats the data as full range YUV, and compresses it again to fit in limited range YUV. This "doubly compressed" data is embarrassingly common (maybe 5% or so of Blurays have this problem). If you get a Bluray like this, or any video with the same issue, the blacks will be bright grays and the whites will be dull (similar to how the author describes limited range YUV). The player on the end is reading the YUV values correctly, but the data has been doubly compressed so it only undoes one round of this.

The issue is well known in the pirate communities as an example. Reliable release groups always fix these broken Blurays in their encodes. This is usually described as "fixing levels" or something like that.
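The "doubly compressed" mistake described above is easy to see numerically. A toy sketch (my own illustration of the arithmetic, not code from any mastering tool): limited-range encoding maps full-range luma 0..255 into 16..235, so applying that mapping to data that is *already* limited range lifts blacks and dulls whites.

```python
# Limited-range luma compression applied once (correct) and twice (the
# authoring mistake): black becomes a visible gray, white goes dull.
def full_to_limited(y: float) -> float:
    return 16 + y * 219 / 255

once_black, once_white = full_to_limited(0), full_to_limited(255)          # 16.0, 235.0
twice_black, twice_white = full_to_limited(once_black), full_to_limited(once_white)

print(round(twice_black), round(twice_white))  # ~30 and ~218 instead of 16 and 235
```

A player reading the file correctly undoes only one round of the compression, which is why the blacks still come out gray.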


Interesting post!

I have one question about terminology though: when you refer to “players“, do you mean all kinds of players or just physical hardware devices like Blu-ray players?

I ask because I haven't had/observed any issues or defects with colour in software/PC-based desktop players at all.

Disclaimer: not in the Blu-ray-authoring business.


It’s pretty easy to verify: create a uniform white png and convert it to yuv420p h264 via ffmpeg:

ffmpeg -i input.png -pix_fmt yuv420p movie.mp4

Open in VLC or your favourite player and take a screenshot. Now check with a color picker whether you have white as 255 or 235.

Update: checked with VLC and QuickTime Player, both handled it correctly and displayed white as 255.
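The scaling that test verifies can be written out directly (my sketch of the standard limited-to-full luma expansion for a gray pixel, ignoring chroma): a correct player maps limited-range luma 16..235 back onto display RGB 0..255.

```python
# Display-side expansion of limited-range luma for a neutral (gray) pixel:
# a spec-compliant player shows Y'=235 as RGB 255 and Y'=16 as RGB 0.
def limited_luma_to_rgb_gray(y: int) -> int:
    return round((y - 16) * 255 / 219)

print(limited_luma_to_rgb_gray(235))  # 255: limited-range white displays as full white
print(limited_luma_to_rgb_gray(16))   # 0: limited-range black displays as full black
```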


I'm referring to both. It's more likely, I suppose, that software players could have problems in this area, but that's simply because many of them are free and they don't have to pass any kind of certification process. As a matter of fact, all well known software players I've encountered seem to do this right. Regular limited range YUV works pretty much everywhere. Doubly compressed YUV (the encoding mistake I was talking about) works nowhere because the data in the file is incorrect and has to be manually filtered to repair the damage.


Wow! Talladega nights was the first movie I saw on a Bluray setup. (PS3 at a friends.) I was shocked at how badly mastered it was. It sounds exactly like that problem. Everything was washed out.


Yeah, that was one of the first Blurays that ever came out, as I recall. Another issue that you often see with those early ones is that they're actually encoded with the MPEG-2 video codec (H.262), same as DVDs. That's allowed under the Bluray spec, but of course anything released these days is going to use H.264 with vastly better picture quality as a result.


I didn’t know about MPEG2 in Bluray. Still, with higher bitrate?


This.

Carmack is in the wrong here.


I see a lot of people confused about the host (Facebook). Carmack initially posted a lot of high quality articles on AltDevBlogADay. However, that site went down. Carmack said he moved to Facebook because he feels it's more reliable than other (cleaner) third-party hosts, which might not be around 10 years from now.


I've seen a lot of similar posts hosted on Google Plus and shared on HN in the past. They are all gone, now.

Sure, Google has a horrible track record when it comes to service durability, but I wouldn't trust Facebook as a reliable host either. Anyone having relied on their @facebook.com email address (2010-2016) may agree.


I just happened to find where he said that: https://news.ycombinator.com/item?id=17068101

"Years ago, I felt burned when I wrote several articles for #AltDevBlogADay, and they vanished. I have much more confidence that what I write on FB won't vanish. [...]"


also see "Falsehoods programmers believe about [video stuff]": https://haasn.xyz/posts/2016-12-25-falsehoods-programmers-be.... previously discussed: https://news.ycombinator.com/item?id=13259686.


John Carmack 7 hrs ·

Adventures with ffmpeg and color ranges.

A video legacy issue that can cause a lot of problems is the issue of “limited” versus “full” component range. For arcane reasons related to TV broadcast limitations, many video formats restrict the YUV color components to be in the 16..235 or 16..240 range instead of the full 0..255 range. Losing 14% of the already-barely-enough 8 bit dynamic range is bad enough, but it also often results in the black range starting at a quite visible grey value because most players don’t rescale the range. This is usually visible as ugly banding or blocking in dark scenes.

For ffmpeg, the trick to avoid this is to use ‘-pix_fmt yuvj420p’, which says to use the j-for-jpeg full range in a 420 YUV subsampled p-for-planar format.

If you are starting with either RGB images, a 10/12 bit format, or a yuvj420p format video as input, then with the libx264 codec, you would get a full range output. Note that any video processing tools used along the way could also limit the range, and once it is gone, there is no getting it back, so you must be very careful and check your entire pipeline!
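The "no getting it back" warning is a matter of counting codes; a toy sketch of why (mine, not from the post): squeezing the 256 full-range values into the 220 limited-range values must merge some of them, and no later expansion can un-merge them.

```python
# Range compression is lossy: mapping full-range 0..255 onto limited-range
# 16..235 leaves at most 220 distinct codes, so some inputs collide.
def to_limited(y: int) -> int:
    return round(16 + y * 219 / 255)

survivors = {to_limited(y) for y in range(256)}
print(len(survivors))                      # 220 distinct codes out of 256
print(to_limited(3), to_limited(4))        # an example collision: both land on 19
```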

When you use this format, ffmpeg complains with ‘deprecated pixel format used, make sure you did set range correctly’, but you should ignore this warning.

Ffmpeg would like the world to move to specifying the range independently from the YUV channel subsampling and layout:

Setting ‘-color_range 0 -pix_fmt yuv420p’ makes the output format yuv420p
Setting ‘-color_range 1 -pix_fmt yuv420p’ makes the output format yuv420p(tv)
Setting ‘-color_range 2 -pix_fmt yuv420p’ makes the output format yuv420p(pc)

Unfortunately, this isn’t yet uniformly handled throughout all the internal format tests.

The libx265 integration in ffmpeg didn’t support the deprecated yuvj420p pixel format, only the basic yuv420p one, and no matter what I did, my test videos were always coming out limited range. It didn’t matter if you added a ’-color_range 2’ or an ‘-x265-params range=full’. Those will change the settings in the VUI (Video Usability Information) section of the output, but the values are still compressed to the limited range.

Regardless of the input data, any 8 bit h265 video coming out of ffmpeg was limited range!

I walked through the libx265 code looking for range compression, but it turned out that all the damage was being done by ffmpeg before the data got to x265. Ffmpeg will automatically convert formats when the output differs from the input, and since x265 only supported yuv420p, any full range input would be converted down to limited range.

Adding ‘-v 48’ to ffmpeg will dump more information, including this auto_scaler invocation, which is what is killing the full color range: [auto_scaler_0 @ 000001f55ecd1a40] w:2048 h:2048 fmt:bgr24 sar:0/1 -> w:2048 h:2048 fmt:yuv420p sar:0/1 flags:0x4

I was about to start hacking the code to at least do what I wanted for my use case, but I tried an appeal to Twitter:

https://twitter.com/ID_AA_Carmack/status/1131715388067274753

My suspicions were confirmed, but Derek Buitenhuis went ahead and submitted an official patch to get yuvj420p accepted by libx265, and windows builds are already available at https://ffmpeg.zeranoe.com/builds/.

I suspect there was probably some way of working around this involving explicit format conversion filters with -src_range and -dst_range overrides, but this is now working as you would expect it:

ffmpeg -i source.mp4 -c:v libx265 -pix_fmt yuvj420p dest.mp4

Bravo to ffmpeg!



Fun fact: sRGB also has a non-black black point that is (fortunately) virtually ignored by everyone.


The sRGB black point just means that (0,0,0) is assumed to have a luminance of 0.2 cd/m^2, because your screen reflects ambient light and other factors.

Ignoring it will give you nonsensical values when you try to do conversions into other color spaces or when you try to do color comparisons.


Source?

As far as I can tell, in XYZ black is (0,0,0), and the only sRGB value that maps to that is (0,0,0).


This is incorrect.

If sRGB (1,1,1) is XYZ (1,1,1), then sRGB (0,0,0) is XYZ (0.0025,0.0025,0.0025).

Or if you are using absolute XYZ values, sRGB (0,0,0) is XYZ (0.1901,0.2,0.2178).
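Those figures follow from the sRGB viewing conditions; a sketch of the arithmetic (my own, assuming the IEC 61966-2-1 reference values: 80 cd/m² white luminance, 0.2 cd/m² black point, and D65 white chromaticity x=0.3127, y=0.3290):

```python
# Deriving the sRGB black-point XYZ values quoted above.
white_lum, black_lum = 80.0, 0.2   # cd/m^2, sRGB reference viewing conditions
x, y = 0.3127, 0.3290              # D65 chromaticity coordinates

# Relative luminance of black with white normalized to Y=1:
print(black_lum / white_lum)       # 0.0025

# Absolute XYZ of the black point at D65 chromaticity (Y = 0.2 cd/m^2):
X = black_lum * x / y
Z = black_lum * (1 - x - y) / y
print(round(X, 4), black_lum, round(Z, 4))  # ~0.1901 0.2 ~0.2178
```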


It's weird to think how John Carmack went from .plan files to Facebook posts. I think it might be safe to say he had the most widely read .plan file in existence. Wish he would go back to it.


Yes, I wondered this also. It is natural to use his employer's platform. However, I wonder if in 10-15 years this post will be lost in Facebook changes, while the .plan files from the 1990s will still be available somewhere.


Just as the .plan files were mirrored by some enthusiasts, so too will his Facebook posts.


Are they mirrored already? I don't want to visit facebook.



Maybe he was asked to publish all his communication on Facebook as part of his career move, along with a large compensation. Unless he really believes Facebook is the right place to post this kind of thing nowadays - who knows?


I suspect it’s not his decision to make.


That's totally absurd. Luminaries like Carmack are anything but prisoners; any firm in SV would pay big bucks to say "John Carmack works here".

Facebook may be a big conglomerate these days, and they may not be the cool kid on the block anymore, but that doesn't mean that they're idiots. If anything, they treasure the presence of someone like Carmack all the more, because it's no longer reputationally free for people like him to come on board. I'm sure that Facebook is appreciative when he blogs on their platform, but it's patently ridiculous to pretend that anyone would -- or could -- force it.

In fact, Facebook has already been forced to pay at least dozens of millions of dollars essentially as a direct consequence of employing Carmack [0], and I'd be willing to bet they'd happily pay (at least) dozens of millions more.

If you want your mind blown even harder, look up some of the paperwork around the Waymo / Uber debacle [1]. SV companies pay big bucks for this type of talent, even the ones you haven't heard of, like Anthony Levandowski.

If the roll of the dice is in your favor, SV stardom is at least as valuable as Hollywood stardom (and, for the most part anyway, you still get to walk around in public unharrassed).

Disclaimer: I have no special connection with any of the companies or people mentioned, I don't know the inside baseball, and this is all conjecture. But at least it's better supported than yours. ;)

[0] https://en.wikipedia.org/wiki/ZeniMax_v._Oculus

[1] https://www.cnbc.com/2017/04/03/waymos-uber-lawsuit-reveals-...


Picturing The Zuck leading two mutant mandogs on leashes: the audience is horrified to realize they're Yann LeCun and John Carmack, though 'heavily modified' and almost unrecognizable in their snarling ferocity.

"This fist holds the Ego. This fist holds the Id. I bring you a new existence: the GlobalCoin. I came up with that myself. As Super-Ego."


I appreciate your imagination.

[not the person you responded to]


I don’t mean to say he’s “held prisoner”, more that the company culture is to post things using the company’s platform, and he’d rather not stick his head out too far from that path. (Which I guess could be seen as “him making a decision”. But it’s not a decision made in a vacuum.)


Hero worship aside, in the grand scheme of things Facebook, Carmack is of almost 0 importance.


Sure, but in the grand scheme of things Facebook, whose market cap currently sits at 526 billion dollars, the correspondingly inconsequential sum of something-something-millions of dollars is worth the potential benefit.

In exchange for their patronage, employers to the world's technical stars get to use their name in recruitment and other types of marketing, and they're well-positioned to profit from the star's next great achievement, if indeed one ever occurs. This is very similar to the benefit to a record label or a movie studio in signing a deal with the current one-hit wonder or starlet of the hour.

As in any employment transaction, there can be no true guarantees. None of us can promise that we'll be here tomorrow. But the value of investments in strong personal brands has proven profitable enough over the last several decades that many companies are eager and willing to make them. Entire industries are built around them. I don't think it's going to stop any time soon. cf. the Lindy Effect [0]

There is surely some limit to Facebook's tolerance of a single man's foibles, regardless of star power. I would bet my bottom dollar that "blogged off-Facebook" is nowhere near it.

[0] https://en.wikipedia.org/wiki/Lindy_effect


I don't think the statement is entirely unreasonable given that JC is a Facebook employee: he's one of the tech masterminds behind their VR initiatives, and the optics of having the tech face of Oculus dog-food your social media platform is important to at least some of the brass there.


John Carmack can blog about ffmpeg bugs wherever he wants. There is no way that stopping him from doing so would even be remotely legal. He probably just posts on Facebook because it's convenient and reaches a lot of people.


I distinctly remember him saying he was only creating a facebook account because Oculus was bought by them. I strongly doubt he would be on there at all if it wasn't for the acquisition.


I don’t mean that he is legally bound to the platform. Just that the company culture is to use the platform, and he may see not using the platform as making a statement he does not want to make.


You think someone is telling Carmack where he can and can't post? Why on earth would they do that? And why on earth would he put up with it?


More that the company’s culture says you should post things using the company’s platform, and to not do so might be making a statement he doesn’t want to make.

Otherwise, why would he move away from .plan?


You said 'it wasn't his decision to make' not 'perhaps he wants to use his employer's platform'. If you don't want to imply it's coercive, you should phrase it in a way that doesn't imply it's coercive.


I believe it’s a “perhaps he feels he should be using his employer’s platform”. A murky coercion middle ground.


Ok, but "I don't think it's his decision" is then a weirdly overwrought (or less charitably, thoughtlessly provocative) thing to say. It's totally his decision. Nor do I think anyone at FB would be stupid enough to tell him where to post random ffmpeg things. He's also active on twitter - as you can see in the post we're discussing.


Could it be that a lot of people don't know what a .plan file is, or how to read content from one?


If John Carmack started updating a .plan, people would figure out how to check a .plan.


It doesn't have to be coercive. Some of us would like to know what the people who hire us are up to even if we don't personally have any interest in it.


The best part about these blog posts is that people discuss the hosting service more than the content of the article at this point.

No one is nearly as interested in Carmack's thoughts on codecs as they are in the domain name of the server he's posting to.

And yet, we'd all read his stuff no matter where he puts it. But maybe he reaches more outsiders this way? I doubt it.



