ITU Approves H.264 Video Standard Successor H.265 182
An anonymous reader writes "The H.265 codec standard, the successor of H.264, has been approved, promising support for 8k UHD and lower bandwidth, but the patent issues plaguing H.264 remain." Here's the announcement from the ITU. From the article: "Patents remain an important issue as it was with H.264, Google proposing WebM, a new codec standard based on VP8, back in 2010, one that would be royalties free. They also included it in Chrome, with the intent to replace H.264, but this attempt never materialized. Mozilla and Opera also included WebM in their browsers with the same purpose, but they never discarded H.264 because most of the video out there is coded with it.
MPEG LA, the owner of a patent pool covering H.264, promised that H.264 internet videos delivered for free will be forever royalty free, but who knows what will happen with H.265? Will they request royalties for free content or not? It remains to be seen. In the meantime, H.264 remains the only codec with wide adoption, and H.265 will probably follow in its footsteps."
So who won? (Score:2)
Several companies made proposals for what would eventually become H.265.
Who won?
Re:So who won? (Score:5, Informative)
Nobody "won". Companies weren't making proposals for complete replacements for h.264. They were making proposals for incremental improvements on h.264. h.265 is a collection of those different improvements. Each one is small in itself, but they add up.
Re: (Score:1)
Then whose patents have now become gold mines and/or roadblocks?
Re:So who won? (Score:4, Informative)
Then whose patents have now become gold mines and/or roadblocks?
The H.264 patent pool has 30 licensors and the list of patents is 59 pages long, so the short answer is: most of the industry. Apart from Google with WebM and previously Microsoft with VC-1, there is surprising unity. My predictions are as follows: HEVC will be as dominant in hardware as H.264, there will be an open source encoder like xvid/x264, and those who can't or won't use that will use WebM despite the somewhat larger size, because Google will probably fight to back it as a free codec. Anything else will never go anywhere outside geek circles, like Vorbis or Theora.
Re: (Score:2)
As it stands, WebM is somewhat less effective than h.264, and as such, it will never be competitive with h.265... No more so than MPEG-4 ASP is competitive with h.264.
WebM completely failed to gain any traction whatsoever against h.264, so why should it do any better against h.265?
Re:So who won? (Score:5, Interesting)
You're being very kind by saying WebM is "less effective" compared to H.264. I'd put it closer to "why in the hell would I want crummy looking compression unless I use at least twice the data rate?" This from someone whose livelihood partially comes from putting compressed streams on the Internet. WebM isn't good enough and just got lapped again.
Re: (Score:2)
Well if you're rendering some cutscenes for a game and want a codec that is free to use and better than MPEG1 - MPEG2 and all newer ones are still patented - then WebM might fit the bill. I'll agree it doesn't take much to win that category though.
Re: (Score:3)
Re: (Score:2)
HA! True dat. And they accuse Apple of form over function!
Re:So who won? (Score:4, Insightful)
Google is already working on VP9, so they aren't giving up quite yet. Whether they'll manage to be competitive is another matter, but at least they're trying.
Re: (Score:2)
WebM completely failed to gain any traction whatsoever against h.264, so why should it do any better against h.265?
Well, if WebM were as good as h.265, then we'd be in a place where no hardware supports either standard and new hardware could support both. Right now, h.264 has hardware support and WebM doesn't, putting it at a large disadvantage.
Re: (Score:3)
If it's all about Apple, then how do you explain Google quietly backing down from their earlier promises to ditch H.264 support in Chrome?
Re:So who won? (Score:4, Informative)
Anything else will never go anywhere outside geek circles, like Vorbis or Theora.
Please watch those overly broad claims. Vorbis is now well established in a number of niches, notably video game sound content.
Re: (Score:2)
Amazing how that works. Company does a bunch of research work, and then says "hey, you can use the research work we did if you pay us". It's almost like the people there are normal human beings who want to live and eat!
Re: (Score:2)
But if the ITU is going to make it a standard, it really should require patent grants for any patents covering that standard, and require all participating entities to agree that any patents they currently hold or acquire in the future will not be used against those implementing the standard.
Re: (Score:2)
It's funny how different Slashdot sentiment was when it was discussing Google as the plaintiff trying to get an injunction over standards-essential patents...
Re:So who won? (Score:5, Insightful)
Yeah, it's funny. Almost like there's more than one commenter.
Re: (Score:2)
Google is also in a patent nuclear war with Apple, and some of those patents should be thrown out entirely. Like any war, you have to do dirty things to survive, because you can't count on the other side acting civilized.
Patents more and more seem to be nothing but a pointless burden. Both classes should be done away with (the bogus ones and the ones that standards depend on).
Re: (Score:3)
Except for Rambus, that's how it works.
DVD, Blu-ray, 3G/LTE and lots of other standards are patent pools.
You donate your patents and agree on a small royalty for anyone who wants to use them.
Re: (Score:2)
What would be better would be requiring a Free implementation of said standard, and if $BIGCO doesn't want to (or can't due to other obligations) make their resulting product Free as well, they can pay a license fee. Just like all the other dual licensed software/source out there....
Re: (Score:2)
What does that actually mean in English?
Who should make a free implementation? And how do they do that when they have to pay royalties?
Who is $BIGCO? And who do they pay a license fee to? MPEG-LA?
Are you saying companies should be allowed royalty-free use of patents if the resultant software is free? That's quite different from requiring a free implementation.
Re: (Score:2)
That's a popular opinion, but it's not part of the ITU charter. The ITU (and ISO) make official standards that aren't free. The ITU/ISO don't have an open source/free only perspective. Nor do they disallow competing standards.
Re:So who won? (Score:4, Insightful)
"Company does a bunch of research work, and then says "hey, you can use the research work we did if you pay us"."
Translation
"Somebody does some research then a company patents it and every little incremental change every few years to keep old patents alive and to stop anyone else trying to enter the market" ...
"It's almost like the people there are normal human beings who want to live and eat!"
Sorry, I couldn't translate that with a straight face. It's even more laughable and insulting than when the RIAA says it.
Re: (Score:2)
Re: (Score:3)
> Company does a bunch of research work, and then says "hey, you can use the research work we did if you pay us".
Patenting and/or charging to do math is idiotic. The point of having a standard is that _anyone_ could read it, and implement a working version. Standards _need_ to be free else society literally pays the price of "progress ransom"
You don't have to pay a fee to write HTML, Javascript, etc. You shouldn't have to pay a fee just to shuffle numbers around - i.e. to encode video.
Re: (Score:2)
Very true, but I'd go as far as to say you can charge money to /encode/ video, but should have no say over /decoding/ it. Therefore, the person encoding can choose to use a better, paid-for encoder, or a free one. The user won't care really, because they can decode it either way.
Re: (Score:2)
That's a decent trade off; maybe a good trade off might be:
Encode at lower quality (lower bitrate) = free
Encode at higher quality (higher bitrate) = $
As soon as you force people to license the codec, you shut a ton of people off from adopting it, which is completely counterproductive.
Re: (Score:2)
Standards _need_ to be free else society literally pays the price of "progress ransom"
In other words, unless it's free, then people who want to use it have to help pay for the cost of developing better stuff. Just, WOW. What a concept.
Re: (Score:2)
Codecs aren't rocket science, just basic computer science. (Oblg. "Brain Surgery" http://www.youtube.com/watch?v=THNPmhBl-8I [youtube.com] )
You don't have to pay a license just because you want to implement (fancy) math on a computer, which is ALL a codec is.
If companies want to license their encoder for a fee I don't have a problem with that AS LONG as it becomes free (donated to the public domain for the benefit of everyone) in 5 - 10 years. This "disease of greed and screw the public benefit" needs to stop at some point.
Re: (Score:2)
Company does a bunch of research work, and then says "Hey, we're now going to force everyone in the world to use our research work and pay us to do that, even if they'd prefer to use someone else's work for cheaper or free"
I forget, are we for or against authoritarianism? Depends who's paying the corporate standard committees, I guess.
Re:Amazing how you twisted that. (Score:5, Insightful)
This adds a sunk cost to the barriers to entry into the device market, in favour of the established market dominators (which is what patents are all about), and to the detriment of free market, consumers and technological progress.
Re: (Score:2)
FAT/FAT32 isn't a poor technology, it's a simple technology. It's not very complicated, but the implementation has evolved over nearly 40 years.
Secondly, you don't have to pay royalties to Microsoft for using FAT/FAT32 itself. You have to pay Microsoft if you use the same exact algorithm for storing filenames longer than 8.3 on FAT. You are free to use a different algorithm and not pay any royalties, or stick to 8.3 filenames as the original FAT/FAT32 did.
Re: (Score:2)
It's only been hacked to support larger disks and longer file names, and still it does that poorly (high internal fragmentation, small maximum file size, no support for extended attributes, poor performance on optical storage and flash, and let's not talk about missing features).
Hacked -- Improved. Those words are pretty much interchangeable depending on your own view and biases. Also, systems using FAT can use extended attributes if they wish. OS/2 does just fine with extended attributes on FAT, and so does cygwin. Just because FAT doesn't explicitly say this is where you stick them doesn't mean you can't write a file system driver on top of it that puts them wherever you want. Yes, FAT has poor performance on optical storage, but why would you use FAT on it in the first place?
Re: (Score:3, Insightful)
The point is that there is not much choice if it is part of an interoperability standard. You simply cannot view a H.264 video on the web with a browser that only supports WebM, just as you'll have no luck watching NTSC broadcasts with a PAL-only TV. Of course you are free to try to sell that PAL-only TV in the US, but you won't succeed, not because it is bad (the same TV may sell like crazy in Europe), but because it doesn't work with US broadcasts.
You only have a choice if there are two options that both
Huh? (Score:2)
Mozilla and Opera also included WebM in their browsers with the same purpose, but they never discarded H.264 because most of the video out there is coded with it.
What? Firefox didn't have H.264 support until late 2012.
Re: (Score:1)
And I believe Opera still doesn't, except on Unix where it'll use the installed GStreamer plugins where available.
Re: (Score:1)
Mozilla and Opera also included WebM in their browsers with the same purpose, but they never discarded H.264 because most of the video out there is coded with it.
What? Firefox didn't have H.264 support until late 2012.
See? Since they didn't have it, they couldn't discard it.
In other news, lynx also never discarded support for H.264.
Re: (Score:3)
What? Firefox didn't have H.264 support until late 2012.
Unless you're talking about some non-release version of Firefox, it *still* doesn't have it. Though I think Firefox mobile does (not sure if it's the release version or not).
time to transcode again (Score:5, Funny)
Being a videophile, first I encoded everything to divx, then I transcoded to h.264. Now I suppose I'll turn them all into h.265 - it'll be the best quality yet.
Re: (Score:2)
Being a videophile, first I encoded everything to divx, then I transcoded to h.264. Now I suppose I'll turn them all into h.265 - it'll be the best quality yet.
A videophile maybe, but a clueless one indeed. You lose quality on each transcode, you don't gain any.
Re: (Score:2)
Hah, I missed the sarcasm tag. Oh there was no sarcasm tag. Nice troll.
Re: (Score:2)
Oh, that is not the way to go. You ought to see what you can do with VHS. Capture with MJPEG to Cinepak, then export to VCD. Rip that to Indeo 4, then convert to Indeo 5. Import into Adobe Premiere and run some filters on it. Export to SVCD. Rip and upconvert the SVCD into MPEG2 720x480 and export to a DIVX file. Convert to WMV with Windows Media Encoder. Import back into Adobe Premiere, add a few more filters, export to Quicktime. Upload to Youtube, using their video stabilizers and automatic filters.
Re:time to transcode again (Score:5, Funny)
This is the Slashdot QC and calibration department. Your yearly sarcasm and humor detector calibrations are due. Please leave the detectors in the tray by the door at the end of your shift.
Thank you.
--
BMO
Re: (Score:2)
This is the Slashdot QC and calibration department. Your yearly sarcasm and humor detector calibrations are due. Please leave the detectors in the tray by the door at the end of your shift.
FYI, I think he already did.
Re:time to transcode again (Score:5, Funny)
Article summaries are part of the Editorial Department, down the hall and to the left.
Coincidentally it's in the last stall of the washroom.
--
BMO
Re: (Score:2)
I can't wait until lossless video encoding becomes practical.
Why? The visually perceivable differences between the source and high bit/pixel H.264 are almost non-existent.
There are generally more differences between the actual source (film/captured video/etc.) and the adjusted-before-encoding (filtering, color-"correction", etc.) source than those caused by lossy encoding.
Re: (Score:2)
Repeatedly transcoding between lossy formats degrades quality each time. *You* may not perceive differences, but encoders do, and they tend to amplify those differences until they become very noticeable visual artifacts - no matter how many bits you use.
If you need to work with original content to edit it before the final compression for consumption, then you need lossless. But there are plenty of lossless formats that will work for that, and some are compressed. If you dedicate 1TB or so to your edit workspace, you won't need to use a compressed format unless you are working with the entirety of a modern 2-hour movie. I learned this when I accidentally forgot to choose a compression method when re-encoding some TV shows at 720p, and it took over 4 hours
Re: (Score:3, Insightful)
There are already lossless video codecs out there. Lagarith is a recent and popular one. The problem is that they only cut maybe 2/3 off your raw file size. Ten seconds of raw 1080p video is over a gigabyte. There's just too much information there -- you have to throw some away to get reasonable compression ratios. Waiting for lossless video to be as small as H.264 is like waiting for a 200MB download for a DVD-sized Linux ISO. Sadly, it's just not going to happen.
Re: (Score:2)
I don't think anyone is waiting for lossless codecs to get smaller, they are waiting for the hardware to get bigger. It happened to compressed formats for music in the 90s and video in the 00s, now the teens may start to see losslessly compressed formats rule.
Re:time to transcode again (Score:4, Informative)
The storage is already here - 4TB drives can hold a useful amount of lossless video. A 1080p video frame is around 6MB uncompressed; at 30fps that's 180MB/sec. If you want true 1080p60, that's 360MB/sec, or about 3 seconds a gigabyte. A minute takes 20GB, 1TB can hold 50 minutes. 4TB can hold 200 minutes, or just over 3 hours of uncompressed 1080p60 video.
The big problem has been the bandwidth required - lossless video requires a ton of bandwidth - it's why 4K cameras use SSDs for storage - spinning rust cannot maintain sufficient data rate. Or why video editors tend to be the biggest users of RAID-0 (striping, no redundancy) storage.
And most cameras don't use lossless to begin with - a 4K frame quadruples the data rate (turning our 4TB drive into a still-useful 50 minutes of video storage), but we're talking about a massive 1.4GB/sec. The ever-popular RED cameras use SSDs, and proprietary REDcode codecs, in order to keep data rates down enough for an SSD.
Want to go lossless? You'll need to go back to film.
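The arithmetic in that comment can be sketched as a quick back-of-the-envelope calculation (assuming 3 bytes per pixel, i.e. 24-bit RGB; real capture pipelines often use 4:2:0 chroma subsampling at 1.5 bytes per pixel, which roughly halves these figures):

```python
# Back-of-the-envelope uncompressed video storage, per the comment above.
# Assumes 3 bytes/pixel (24-bit RGB); 4:2:0 YUV would be 1.5 bytes/pixel.

def uncompressed_rate(width, height, fps, bytes_per_pixel=3):
    """Data rate in bytes per second for uncompressed video."""
    return width * height * bytes_per_pixel * fps

rate_1080p60 = uncompressed_rate(1920, 1080, 60)   # ~373 MB/s
minutes_on_4tb = 4e12 / rate_1080p60 / 60          # ~179 minutes

print(f"1080p60: {rate_1080p60 / 1e6:.0f} MB/s")
print(f"4TB holds about {minutes_on_4tb:.0f} minutes")
```

The comment's round figures (360MB/sec, just over 3 hours on 4TB) line up with this within rounding.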
Re: (Score:2)
Even with the bandwidth, having 3 hours on a 4TB drive isn't going to cut it. I have nearly 1000 movies and TV shows on my 3TB drive. Once we have PB sized drives with GB/sec transfer rates, then we will be talking.
Re: (Score:2)
I can't wait until lossless video encoding becomes practical.
Are you waiting for JPEG images to die as well? I bet you love them 100MB download per page websites too.
Re: (Score:2)
compression gets better to give the ILLUSION of better quality, but lossless is lossless
What a stupid comment ... of course better compression algorithms can give better quality (than an inferior algorithm) for a given size - it's not an illusion
Dhurum (Score:2)
Re:Dhurum (Score:5, Informative)
Nothing more than H.264 had. DRM is implemented at the container level, not the bitstream level.
BD+ (Score:2)
DRM is implemented at the container level, not the bitstream level.
BD+ in Blu-ray Disc muddies this a bit, as it allows transforming the decompressed image based on whether or not other authenticity checks pass.
Re: (Score:3)
BD+ in Blu-ray Disc muddies this a bit, as it allows transforming the decompressed image based on whether or not other authenticity checks pass.
Although "transform the audio and video output" is listed as an option of BD+, it doesn't work the same way as most humans would parse that description. Based on this [wikipedia.org], it's just another way to encrypt the full .m2ts stream.
If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection, as you would need to decode the H.264 stream, apply the BD+ operations, then re-encode those frames to put back into the ripped stream.
Re: (Score:2)
If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection
Exactly as planned.
In addition, in order to alter the uncompressed data, it would require that every Blu-Ray player use exactly the same H.264 decoder with exactly the same options and only apply video alterations after BD+ is done with the data.
I was under the impression that the transformations didn't need to depend on bit-perfect output from the video decoder. Just guessing, but they could involve color space modification, rotation, flipping, cutting and pasting, bending (remember old scrambled channels from the VideoCipher II era?), and the like.
Re: (Score:2)
If it actually altered the video after decompression but before output, it would be impossible to rip a Blu-Ray losslessly with that protection
Exactly as planned.
Since this would effectively stop ripping, I'm pretty sure if it were possible while still letting Blu-Ray players play the movie, it would already have been done.
I was under the impression that the transformations didn't need to depend on bit-perfect output from the video decoder. Just guessing, but they could involve color space modification, rotation, flipping, cutting and pasting, bending (remember old scrambled channels from the VideoCipher II era?), and the like.
First, they have to be simple, because of the limited power of the BD+ virtual machine, so anything that involved serious memory moves would be out. Color and pixel value would be pretty much the limit.
Depending on how much the decoding varies from the reference, there might be some seriously noticeable artifacts.
Re: (Score:2)
DRM is implemented at the container level, not the bitstream level.
BD+ in Blu-ray Disc muddies this a bit, as it allows transforming the decompressed image based on whether or not other authenticity checks pass.
Blu-ray is generally pathetic and an altogether unpleasant experience for the user with slow startup and numerous unskippable ads and threats, just to name two deficiencies.
Re: (Score:2)
Go jack off in a flower pot, anonymous wanker. HD-DVD would have surely been an annoying piece of crap as well; with Microsoft driving it you can count on it. The only difference is, we *know* Blu-Ray is a piece of crap, which we would not have known had it lost the fight and remained in obscurity.
I hate the dark cloud over software advances. (Score:2)
Re: (Score:3, Insightful)
Good thing software patents don't exist in most of the civilized world then.
Immigration (Score:2)
Good thing software patents don't exist in most of the civilized world then.
Does "most of the civilized world" offer asylum to refugees from regimes with software patents?
Re: (Score:2)
Mp3 (Score:5, Interesting)
Re: (Score:3)
Re:Mp3 (Score:4, Insightful)
Once a standard becomes good enough, people will hang on to it for a long long time. Why bother re-encoding a complete music library from mp3 even if vorbis/aac is clearly the superior codec? Apple has enough difficulties pushing aac through, and not many hardware producers are including vorbis support. I guess the same could be said for windows xp and desktop hardware.
MP3-files are small enough to be streamable perfectly well even on really slow connections, but video files ain't small. A 2-hour, 1080p video file with any kind of a remotely-acceptable quality will weigh in at 4GB+, and well, it sure ain't streamable over very slow connections. Not to mention the fact that bandwidth costs money. Ergo, any developments that result in higher quality at the same size or similar quality at a smaller size are certainly welcome, both for consumers and for content-producers.
As a thought-experiment, let's assume that this or that TV-series I was watching on Netflix weighed in at 1.5GB for a 1h episode, and I watched 15 episodes in a month. That'd be 22.5GB of data. Now, if the move to a new codec reduced filesizes by 5% we'd end up with ~21.4GB of data -- that's already one gigabyte in savings. Now, multiply this by e.g. 200 000 users, what do you see?
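That thought experiment works out as follows (a toy calculation; the episode size, episode count, and user count are the commenter's hypothetical figures, not real Netflix numbers):

```python
# Toy calculation of the bandwidth savings described above.
# All inputs are the commenter's hypothetical figures, not measured data.
episode_gb = 1.5       # assumed size of one 1-hour episode
episodes = 15          # episodes watched per user per month
users = 200_000        # hypothetical user count

per_user_gb = episode_gb * episodes          # 22.5 GB per user per month
saved_per_user_gb = per_user_gb * 0.05       # 5% codec efficiency gain
total_saved_tb = saved_per_user_gb * users / 1000

print(f"per user: {per_user_gb} GB, saved: {saved_per_user_gb:.3f} GB")
print(f"total saved: {total_saved_tb:.0f} TB/month")
```

Roughly 1.1GB saved per user scales to hundreds of terabytes per month across the hypothetical user base, which is the commenter's point.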
Apparently you don't remember it, but at one time, MP3 files weren't small either. I remember it taking about an hour to download a good quality MP3. And there was streaming, too. Things like Real Player provided lower quality, higher compressed versions that were more suitable for streaming. Then do you know what happened next? Did Real Player and stuff like it win out? Nope. I'll give you a hint...the MP3 files didn't get any smaller.
Connections got faster, and bandwidth got cheaper. Much like those days for MP3, today good quality h264 files are a bit cumbersome, but I can easily download them in an hour or 2 with a typical (not even high end) consumer level internet connection. And today there are ways to get lower quality, more highly compressed version that can stream a fairly good quality HD video in real time. Give it another 5 years and the problem will easily solve itself without replacing every single piece of hardware and re-encoding every existing file.
Re: (Score:2)
Really? Must be your cable provider. In the last few years, my cable provider (WideOpenWest) has given a free upgrade from 8Mbps to 16Mbps (and in the 5 years before that we went from 4Mbps to 8Mbps), and introduced new 30Mbps and 50Mbps plans. Comcast has introduced a 100Mbps plan in many areas. Google has their first Gigabit city. I've heard a number of stories of municipalities setting up their own internet service with speeds between 20Mbps and 100Mbps. Verizon Fios has new plans of 50Mbps, 75Mbps, 150Mbps,
Re: (Score:2)
Another thing that plays into it: are you in a major metro area, or more rural? Rural areas are always going to lag behind. The first player in can justify it because they get 100% of the market. After that, additional service providers will only see a fraction of that return, thus it's not worth it to start up there, and thus the first player gets to enjoy their monopoly for a long time (so no incentive to upgrade). So, yeah, they'll always stay behind, but in the context of this thread (whether there's a
Re: (Score:2)
Wow, are you saying that if I multiply 5% times 200,000 users I would get, something like at least 5% of their usage? That's amazing.
Re: (Score:2)
Why bother re-encoding a complete music library from mp3 even if vorbis/aac is clearly the superior codec?
You're asking the wrong question; the right question is how many have FLAC copies of all their MP3s? Because I hope you weren't seriously suggesting they should re-encode from the MP3 files. I think you will find that many people have never even heard of FLAC, and even if they did, few tools have made it easy to create dual FLAC/MP3 rips of a CD, at least not any the average person would have heard about. Assuming he didn't just download those MP3s in the first place and isn't about to chase down different copies
Re: (Score:2)
"this attempt never materialized"?? (Score:5, Insightful)
Apart from the awful English, WebM has been quite successful, too: a lot of software packages use WebM because they don't need to license H.264, and not just open source software.
Video standards aren't replaced overnight, and in fact, in a lot of places can't be replaced at all. The best way of dealing with these kinds of compatibility issues is to offer an alternative when people need to upgrade and change hardware/software anyway. So, let's hope that WebM can compete with H.265, because then we have a real chance of largely getting rid of proprietary video standards.
Re:"this attempt never materialized"?? (Score:5, Insightful)
So, let's hope that WebM can compete with H.265, because then we have a real chance of largely getting rid of proprietary video standards.
WebM could barely compete with H.264, so how the hell is it going to compete with H.265, which is going to offer the same quality as H.264 but only use about half the bitrate?
If Google could have improved WebM this much, they would have.
Re: (Score:2)
So, let's hope that WebM can compete with H.265, because then we have a real chance of largely getting rid of proprietary video standards.
WebM is a distribution codec for YouTube. H.264 is core technology in digital television.
Theatrical production. Cable, broadcast and satellite distribution. Home video. Industrial applications... The list goes on and on and on.
The licensors of H.264/HEVC are global giants in R&D and manufacturing. Philips. Samsung, Mitsubishi. Panasonic. Toshiba. The 1181 H.264 licensees operate on more or less the same scale. The standards they adopt are the standards which stick.
Re: (Score:2)
Which part of "video standards aren't replaced overnight" went over your head?
You can look at PNG/JPEG to see how this is likely to play out, except that the incentives to move from H.264/5 to WebM are actually stronger for many people.
Patent-encumbered standards are stupid (Score:4, Insightful)
The answer is some variant of "follow the money", I'm sure, but why doesn't the standards body in question require that the standard be truly open?
Re: (Score:2)
The standard IS open in that during definition of it anyone (paying to be a member) can contribute, provide feedback, and vote. If you meant free as in beer, they could have required that, but then none of the corporations that did the R&D would have participated and we'd have many "standards" and not just one.
Re: (Score:2)
This is a huge upgrade for any business pushing digital video through wires and radio waves. Even in the case where encoder and decoder licenses are a large cost, they wil
Re: (Score:2)
All of those patents are most likely incredibly trivial, and all the companies and organizations that successfully lobbied them in did so not for their technological benefits but to make sure their patents were as widely used as possible.
If the ITU were to demand patent-free standards, they would be just as good but without the royalties.
Re: (Score:2)
I'm simply amazed at the depths of the dream world that some people live in.
Re: (Score:2)
H.265 is a very large improvement over H.264 (about 50% of the bit rate for equal quality)
According to wikipedia, it's 35.4% smaller
http://en.wikipedia.org/wiki/High_Efficiency_Video_Coding#Coding_efficiency [wikipedia.org]
Re: (Score:2)
Anyone who believes they can describe the efficiency difference between two video codecs as one percentage with 3 significant digits needs their head examined.
Re: (Score:2)
Apparently you don't know what the word average means, or how significant digits are used. The thing being measured has no impact on the number of significant digits you can use. It's purely determined by the precision of your measurements.
Re: (Score:2)
I don't have anything to add to what I wrote above. No one has defined a metric that can meaningfully distinguish between two video codecs to 3 significant digits.
H.265 will gain traction in video conferencing (Score:3)
Given how widespread H.264 hardware implementations are, and the fact that Blu-ray does not have H.265, I'd expect to see adoption first in the video conferencing world (SIP, H.323... Cisco/Tandberg, Polycom, etc).
For real time encoding H.265 can provide a 30% reduction in bitrate at the same quality. Transcoded content like what you might do at home will get some benefit but not as much as the real time stuff (streaming will benefit a lot too).
Re:H.265 will gain traction in video conferencing (Score:5, Interesting)
I also think that H.265 could find its way to satellite TV broadcasting, because its lower bandwidth requirements for 720p/1080i resolution video means they can add in more channels per satellite.
Re: (Score:3)
I also think that H.265 could find its way to satellite TV broadcasting, because its lower bandwidth requirements for 720p/1080i resolution video means they can add in more channels per satellite.
You might be waiting a while. We're still stuck with MPEG2 for our SD channels, over DVB-T and DVB-S.
Re: (Score:2)
Changing codecs is going to require outfits like Dish and DirecTV to replace all of the end user hardware. I'm not convinced that h265 is enough of an improvement for them to consider this.
The more legacy users you have, the harder it is for you to get buy-in on a new digital format. Incremental improvements will continue to be a harder and harder sell to people with legacy content and legacy equipment.
Re: (Score:2)
Because H.265 will have (I believe) half the bandwidth requirements of H.264, DirecTV or Dish Network could either cram in more channels or keep the current channel allocations but at MUCH higher video quality.
Re:H.265 will gain traction in video conferencing (Score:5, Interesting)
First adoption, as usual, will be by HQ rip groups and anime fansubbers. These people pride themselves on being on the cutting edge and implementing stuff that isn't implemented anywhere in hardware yet. They were the guys who moved from h.264 high profile to h.264 10-bit high profile when h.264 hardware support started to become prevalent. They were the ones who moved to h.264 when divx hardware support became prevalent. Etc.
Funnily enough, it was the same for h.264, divx/xvid and so on. Frankly I wouldn't be surprised if many of the guys encoding that stuff actually work in the industry and use their "hobby" as a testbed for new encoding techniques and methods before they go to mass production.
Re: (Score:3)
I expect we'll see services like Netflix jump on the bandwagon pretty fast. They already produce multiple copies of their videos in different codecs to cater to different device capabilities. If memory serves, they do VC-1 for the desktop client, low bitrate h.264 for the mobile clients, and high bitrate h.264 for the STB/console clients.
Migrating platforms which can support it to h.265 will provide them with immediate savings. There aren't that many of them, but the PS3 happens to be their flagship and dev
Moronic Article (Score:4, Insightful)
The 'H' video encoding standards have NOTHING to do with free-to-use codecs. They are a COMMERCIAL industrial standard, designed to be reasonable and safe to license, because of the patent pool.
Complaining that H265 will include some royalty mechanisms is like complaining that the sky is blue! Even the document that will detail the final H265 standard will NOT be free, just as today you have to pay to get a copy of the H264 standard.
The open-source movement is not the same as demanding "death to capitalism" or the end of profit, as some very stupid people here seem to think. The 'H' standards have nothing to do with open-source. However, because the 'H' standards are not industrial secrets, open-source developers can and will develop open-source encoders and decoders.
Talk of WebM is pure garbage, since the key developers of x264 looked at the source Google released, and discovered that VP8 had illegally ripped off the H264 standard (badly), taking advantage of the fact that VP8 was originally closed-source. In other words, Google was conned (actually, this isn't true: Google knew full well that VP8 infringed hundreds of patents, but simply wanted to transfer millions to the owners of the company).
If people want to be activists over the royalty situation, it should be with this goal. Encoders, and encoded video (including streamed) should be royalty free. Only the decoders (hardware or software) should pay a royalty. This way, once you own your tablet, laptop, phone, or Windows, you have already paid for the licence to decode H265, allowing all apps to use this format freely.
The advantage of H265 (and H264) to end users is clear. Tiny, extremely energy efficient, hardware circuits can handle the video decoding, providing first quality video services on devices of all kinds. The standards allow software teams (like those behind x264) to produce insanely efficient, ultra-high-quality encoding solutions, and also allow work to progress on very fast (although low quality or very high bandwidth) hardware encoders.
H265 promises (if the encoding efficiency shown by x264 is possible for H265) 4K films on existing Bluray technology, which is essential since the collapsing market for disks means that it is most unlikely a new disk standard will ever replace Bluray.
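The disc-budget arithmetic behind that hope can be sketched quickly. All figures here are assumptions for illustration: a 50 GB dual-layer disc, a 2.5-hour film, a guessed ~60 Mbit/s H.264 rate for good-looking 4K, and the hoped-for halving from H.265:

```python
# Illustrative only: does a 4K film fit on a dual-layer Blu-ray if H.265
# halves the bitrate H.264 would need?
disc_gb = 50                                   # assumed dual-layer capacity
runtime_s = 2.5 * 3600                         # assumed 2.5-hour film
budget_mbps = disc_gb * 8 * 1000 / runtime_s   # average Mbit/s the disc allows
h264_4k_mbps = 60                              # assumed H.264 rate for good 4K
h265_4k_mbps = h264_4k_mbps / 2                # the "half the bitrate" hope
print(f"disc budget ~{budget_mbps:.1f} Mbit/s; "
      f"H.264 4K needs ~{h264_4k_mbps}, H.265 4K ~{h265_4k_mbps}")
```

Under these assumptions the H.264 figure blows the budget while the H.265 figure fits, which is exactly the point being made.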
To conclude. Standards are good, and some standards will involve royalties.
Re: (Score:2)
> The advantage of H265 (and H264) to end users is clear. Tiny, extremely energy efficient, hardware circuits can handle the video decoding
That's all well and good, but those were supposed to be the advantages of h264 too, and we've already got that, plus tons of legacy equipment and content.
On the other hand, most people are going to be hard pressed to notice any reason to want 4K given that BluRay is already a tough sell with anything much beyond a 1:1 viewing distance to screen size ratio.
but the patent issues plaguing H.264 remain (Score:2)
Yeah, those issues have certainly prevented h.264 from taking off. Luckily WebM adoption has roared out of the gate... *cough*
Missing the forest for the trees (Score:2)
Re: (Score:2)
As soon as 1 or 2 mainstream players support it ... my bet would be mid-to-late 2013
Re: (Score:2)
FYI, h.264 is a video compression format, and x264 is an encoder that produces h.264 output.
Saying "true sceners don't use h264 they use x264" is akin to saying "I don't drink coffee, I drink Folgers."