YouTube Adds 'Leanback,' Support For 4K Video

teh31337one writes with news that YouTube has announced support for 4K video, which runs at a resolution of 4096 x 3072. From their blog: "To give some perspective on the size of 4K, the ideal screen size for a 4K video is 25 feet; IMAX movies are projected through two 2k resolution projectors. ... Because 4K represents the highest quality of video available, there are a few limitations that you should be aware of. First off, video cameras that shoot in 4K aren't cheap, and projectors that show videos in 4K are typically the size of a small refrigerator. And, as we mentioned, watching these videos on YouTube will require super-fast broadband." They provided a small playlist of videos shot in 4K. This announcement comes a few days after YouTube debuted "Leanback," a service that attempts to find and serve videos you'll like based on past viewing habits, as well as offering a simplified method of browsing.
This discussion has been archived. No new comments can be posted.

  • by Surt ( 22457 ) on Saturday July 10, 2010 @01:22PM (#32860868) Homepage Journal

    http://en.wikipedia.org/wiki/Super_Hi-Vision [wikipedia.org]

    4k video is so legacy.

    • by fuzzyfuzzyfungus ( 1223518 ) on Saturday July 10, 2010 @01:39PM (#32860956) Journal
      Particularly given the existence of films that are never actually filmed (i.e. virtually anything Pixar has done), which makes the existence of a camera that can actually handle a given resolution irrelevant to that resolution's "existence", the notion of a "highest resolution" seems rather meaningless.

      This goes double for any format with lossy compression (i.e. pretty much all of them in any sort of practical use), where you could declare that your format is 16,000,000x9,000,000 pixels, and thus the awesomest available, and then compress it down to 1 Mb/s. The result would look roughly like the original Wolfenstein; but it would be the highest resolution out there.
      • Re: (Score:3, Insightful)

        by sznupi ( 719324 )

        Also - tons of people actually have cameras perfectly capable of making videos at this resolution, assuming the videos are of a quite specific kind: stop-motion animation.

        But yeah, I would prefer better bitrates (and/or encoding methods; H.264 won't be the last word) at more "standard" resolutions over such things done basically just for show. Vimeo has "only" HD, but with their higher bitrates the videos look better (plus one can download the original file).

        • Given that this "Super Hi-Vision" did include a specially built camera capable of doing 60 FPS at that resolution, I give the engineers involved credit for that part of the system. There may also have been some interesting FPGA work done to compress the result in reasonable time. Aside from that, though, the whole thing seems like a stunt.
          • Re: (Score:3, Interesting)

            by migla ( 1099771 )

            You could also duct tape a few cameras together and use a computer to stitch the different images together to get a higher resolution...

      • Well, if your generated graphics are vector-based, then the resolution you rasterize to could have some effect on quality, but only if your screen is at least that resolution; otherwise it'll just be downscaled.

        • My point with the generated graphics was just that, if you are talking about videos actually shot with a camera at some specified frame rate, there is, in fact, a meaningful "highest resolution". If no camera presently built can do more than YxZ pixels at 24 FPS, or 60 FPS, or whatever you prefer, then it is meaningful to say that YxZ pixels is the "highest resolution".

          If you are talking about CGI, the output resolution is limited only by storage space and patience. In theory, there is still a "highest resolution"
          • Pure CGI is not completely resolution-independent; a lot of the quality depends on how detailed the textures are.

          • It's not the camera so much as the playback medium. There's little point in mastering to some ridiculously large-format print resolution when hardware that can play it back at speed doesn't exist. That having been said, most effects work these days is done at 4k and downscaled to 2k on release, with really complex/detailed work mastered at 6k or 8k to give the artists extra resolution to nail the smallest details.

            When you watch films in true IMAX, you can often tell which shots/films were mastered at higher resolutions.
      • The eye has a resolution of about 400 dpi at one meter's distance*.
        So the resolvable angle is (1 inch / 400) / (1 m) = 6.35E-5 rad

        4K on a 25-foot screen is 1.9 mm per pixel, or 13 dpi.

        That means you need to put that screen as far as 30m away; any closer and you could theoretically see pixels.

        With 8K, 15m. With your laptop, about 3m.

        *Hint: don't print at a finer resolution.
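
        The same back-of-the-envelope arithmetic as a small Python sketch; the 400-dpi-at-1-m acuity figure is the assumption above, not a measured constant:

        ```python
        # Minimum distance at which pixels blend together, given the assumed
        # eye acuity of 400 dpi at 1 m (~6.35e-5 rad per pixel).
        EYE_LIMIT_RAD = (0.0254 / 400) / 1.0

        def min_distance_m(screen_width_m, horizontal_pixels):
            """Distance beyond which one pixel subtends less than the acuity limit."""
            return (screen_width_m / horizontal_pixels) / EYE_LIMIT_RAD

        print(min_distance_m(7.62, 4096))  # 4K on a 25 ft (7.62 m) screen: ~29 m
        print(min_distance_m(7.62, 8192))  # 8K on the same screen: ~15 m
        ```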

    • by Surt ( 22457 )

      Who in the heck moderated this off-topic? How could this possibly be any more on-topic? It directly refutes a claim made in the summary!

    • Re: (Score:2, Insightful)

      by yoyhed ( 651244 )
      Man, this is getting ridiculous. As sweet as it sounds, do we really need more than 1920x1080? Granted, I don't have a 4000-pixel-wide monitor, but on my 2048x1152, I can't tell a difference between this and 1080p at all.

      Then again, this is on YouTube. I'm sure compression brought the quality below a 1080p Blu-Ray the instant it was uploaded.
      • Re: (Score:3, Insightful)

        by AAWood ( 918613 )

        Of course you can't tell the difference; your monitor resolution means that the video is being rendered down to only a few percent over 1080p anyway, and the same will be the case for almost everyone. Support for this will cater to a niche audience for the moment, whilst also allowing for wider adoption of higher-resolution cameras, monitors and graphics cards. This is how it always is in the world of tech; we settle into a certain pattern of what we can expect our hardware to achieve, and then someone releases

        • by yoyhed ( 651244 )
          I knew someone would come back with the 640K thing. I'm not anti-progress, and of course technology advances at a breakneck pace. I just don't think it's a big deal that YouTube supports it right now, when 99% of people's biggest televisions and monitors don't even come close.

          I also know why I can't tell the difference; my point there was supposed to be that I've got what's considered a pretty high-end monitor by today's standards, and this 4000px video standard gives me zero benefit. It is INCREDIBLY niche
      • by adbge ( 1693228 )

        Granted, I don't have a 4000-pixel-wide monitor

        Nobody does. Consumer monitors with a native resolution of 4096 x 3072 simply aren't built. You would be hard pressed to find even a 60" TV that supports this kind of resolution.

        Hopefully, Google knows something we don't about the future of display resolutions and this is foreshadowing consumer-grade monitors receiving a long-needed bump in pixels per dollar. With Apple's "retina display", maybe higher resolutions will become trendy and spark a resolution war. I don't know much about the actual feasibility

      • by Cyberax ( 705495 )

        "Man, this is getting ridiculous. As sweet as it sounds, do we really need more than 1920x1080?"

        Yes, we do. High-DPI text would look great if our displays supported more than 1920x1080 at a small enough physical size.

        • by yoyhed ( 651244 )
          Again, way to take my comment out of context. I was talking about 1080p as a video standard - considering it's barely even seeing adoption now in 2010. Obviously screen resolution and real estate are nice, but this isn't about monitor resolution, it's about the 4K standard.
      • by Macrat ( 638047 )

        Then again, this is on YouTube. I'm sure compression brought the quality below a 1080p Blu-Ray the instant it was uploaded.

        Not to mention that many Blu-ray discs contain nothing more than a copy of the standard-def MPEG-2 data from the same DVD release.

      • Well, I love the effect that when I hold my phone's super-high-DPI screen at just the right distance and angle, it looks like I'm looking through a window when watching a good-quality video on it.
        From what I heard, this is also the case for UHDV. If it feels *that* real, then it will be worth it for me. But please fill out my whole viewing angle. Those few degrees are really crappy. I want 180x180 degrees! In (visual) stereo! Or even better: all depths at the same time, so your eyes can do the refocusing

      • Re: (Score:3, Insightful)

        by V!NCENT ( 1105021 )

        As sweet as it sounds, do we really need more than 1920x1080?

        Do we really need 24-bit colors? Do we really need one-gigabyte hard drives?! Do we...

        So let me ask you a question: Why do you think anti-aliasing exists? Why do you think most people still print out their emails/letters/reports before sending them?

    • by Sycraft-fu ( 314770 ) on Saturday July 10, 2010 @03:35PM (#32861502)

      SHV is experimental tech. They are playing with it right now, but it isn't in use anywhere, even in the pro world. It is just proof of concept and early testing.

      4k is the high end of cinema. 4k is normally what you scan in and process film at (it is considered to be about the same as good 35mm). You can also get monitors that are very nearly 4k, and the high end digital cinema projectors are 4k. It is a currently used and in production format. If you go to a new, spiffy, digital theater and watch a movie like Avatar, it is probably a 4k projector (though some places with smaller screens use 2k instead, which is just a bit higher than 1080p).

      There's a difference between "Technology that is being developed," and "Technology that is being used."

      Take Ethernet. 100Gb is currently under development. There are test units that exist, and the standard was finalized last month. However, it is not a deployed technology. Your network does not have 100Gb Ethernet backbones. 10Gb is currently the fastest Ethernet out there, the fastest deployed in actual networks right now (fastest Ethernet, that is; I know there are faster POS lines and so on).

      So it is accurate to say 4k video is the highest for now.

      • by Surt ( 22457 )

        Indeed, and TFS specifically says 'available'. Not 'commercialized' or anything more specific. 8K video is clearly 'available'.

    • Re: (Score:3, Interesting)

      by Mikkeles ( 698461 )

      Gee, that's almost the resolution needed to show a square inch of a Dürer etching!

    • The question is not how many feet of screen you need, but what kind of after-market retina do you need to implant to take advantage of this. (And does your current optic nerve have the bandwidth, not to mention the back end behind it?) :)

      • by Ken_g6 ( 775014 )

        Based on this discussion of the iPhone "retina display" [wired.com], a wraparound 180-degree screen of 9,000 pixels in hemi-circumference would match the resolution of the human retina. So you don't need another retina - but you might need LASIK.

        Oh, and neural bandwidth isn't a problem. Maximum resolution is only achieved in a small area of the retina; if a computer could track your eye and move a small display with it, this kind of resolution wouldn't be necessary.
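
        A quick order-of-magnitude check on that 9,000-pixel figure, assuming foveal acuity of roughly one arcminute per pixel (a common rule of thumb; the linked article's exact model may differ):

        ```python
        # Pixels needed to span a 180-degree wraparound display at ~1 arcmin/pixel.
        ARCMIN_PER_DEGREE = 60
        pixels_per_degree = ARCMIN_PER_DEGREE / 1.0  # assume 1 arcminute per pixel
        print(pixels_per_degree * 180)  # 10800.0 -- same ballpark as the 9,000 quoted
        ```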

  • Yay HD. (Score:4, Funny)

    by Anonymous Coward on Saturday July 10, 2010 @01:31PM (#32860902)

    Now I can enjoy horse porn in glorious 4096 x 3072!

  • Not much use to me; my 21" CRT monitor only supports up to 2048x1536, and that only at 75Hz.

    So, the ideal screen for 4K is 25 feet (google says it's 300 inches) or 240 x 180 inch. So that makes it ~17 dpi (4096px/240in). Too low. With good, ~300dpi screen you would only need 20.48 x 15.36 inch screen, or 25.6 inch diagonal, doable, but probably nobody makes monitors with that high resolution. CRT is probably more doable than LCD though and nobody likes CRTs because you can't place several monitors one behind the other without taking a huge amount of space.

    • With good, ~300dpi screen

      make that 200dpi...

    • So, the ideal screen for 4K is 25 feet (google says it's 300 inches) or 240 x 180 inch. So that makes it ~17 dpi (4096px/240in). Too low. With good, ~300dpi screen you would only need 20.48 x 15.36 inch screen

      And it would be entirely wasted, because you don't sit 12 inches from your screen to watch a movie, so you wouldn't be able to see a resolution that high anyway.

      • And a huge screen (for example, in a cinema) is even more wasted on me - either I sit far enough away to see the whole screen without needing to turn my head, and get crap resolution, or I sit near enough to see every detail, but only on a part of the screen. That's why, to me, dpi is more important than sheer size.

        • Try IMAX. The huge screen, larger than you can take in at one look, makes for a very different experience than a normal cinema screen. Especially for 3D.

          Though it's not suited for most films. It's best suited for natural history and space documentaries and the like.

    • nobody likes CRTs because you can't place several monitors one behind the other without taking a huge amount of space

      The rise of LCD monitors in a nutshell right there.

  • Mess (Score:3, Interesting)

    by Wowsers ( 1151731 ) on Saturday July 10, 2010 @01:32PM (#32860912) Journal

    YouTube makes a horrible mess of 1080p hi-def video, and displaying it uses far too much CPU - on my system, much more than the original HD video does. What would it do to video with more detail than hi-def?

    • And it's compressed to everloving hell, so you can't even tell that it's 4k. It's almost bad enough to just look like poorly upscaled 480p.
    • Re: (Score:3, Interesting)

      by cgenman ( 325138 )

      Youtube might be making a play for something. Maybe they want to be the video source for Anime conventions. Maybe they hope to get projected before movies at low-end theaters who don't have advertising contracts but who do have digital projectors. Maybe they want better-than-1080p resolution for those pesky high resolution PC monitors. Maybe they're just trying to counter the image that Youtube is still all about postage-stamp sized videos of squirrels getting drunk.

      Either way, 4K is pretty future-proof

      • by cgenman ( 325138 )

        Now that I say all of that, YouTube might be making a play for Videos-On-Demand for theaters. Have an independent movie theater and want to show Casablanca on Valentine's Day? Why go to a distributor when you could go through YouTube's new Media On Demand service, pay a flat theater rate, and be ready to go in minutes?

        4k is a good way of hedging bets against future functionality.

  • by zogger ( 617870 )

    OK, I am downloading one just to try it out; there's no way in heck I can stream it. I can't even stream "normal" YouTube vids yet - just can't get a good enough internet connection around here for that. So... what is this ultra-high resolution for again? Who has a 25 foot screen at home? Why the bandwidth wastage? Really, just an honest question: if the bulk of humanity can't watch this in the manner it was designed for, why bother? Isn't this like driving around a 3 ton SUV to get to work in? Aren't we supposed

    • Re: (Score:3, Insightful)

      by jvillain ( 546827 )

      I would take Flash-less WebM support over 4K all day long. I can currently view less than 0.01% of YouTube content because of Flash, so I am not really that excited about 4K just yet.

      • Re: (Score:3, Informative)

        by afidel ( 530433 )
        You do know that you can view ~80% of youtube content in H.264 without flash by using a different URL, right?
    • Simple marketing stunt, I expect. For the cost of hosting a couple of stupidly large videos, the bandwidth of a lot of people downloading the first 10 seconds and then giving up, and a couple of fiber-to-home users downloading the whole thing and then realizing that no x86 ever made can allow Flash to decode video at that resolution in real time, Google gets a little more buzz about YouTube.
      • by afidel ( 530433 )
        Flash 10.1 with GPU offload can probably decode 4k video just fine on any modern gaming card.
    • No imagination (Score:3, Insightful)

      by westlake ( 615356 )

      Who has a 25 foot screen at home?

      Well, someone must be buying them, when even Walmart has them for sale:

      Draper Cineflex Cineperm Fixed Frame Screen - 25' diagonal NTSC Format [walmart.com]

      Really, just an honest question: if the bulk of humanity can't watch this in the manner it was designed for, why bother? Isn't this like driving around a 3 ton SUV to get to work in?

      No.

      It's more like the open air cinema projects that began in the silent era:

      Open Air Cinema, [openaircinema.us] Open Air Cinema & Film Aid in Tanzania [openaircinema.us]
      FilmAid [filmaid.org]

  • by Beardydog ( 716221 ) on Saturday July 10, 2010 @01:38PM (#32860950)
    I just tried a couple of seconds at 1080p, and a couple of seconds at 4k on a 1080p screen, and found the difference to be quite noticeable in the details. The downside was that my 8800GT can't actually play 4k video faster than 4fps. How about instead of a 4k option almost no one will use, we try a 1080p option that doesn't have massive blocks, fringes, and blurring?
    • by fuzzyfuzzyfungus ( 1223518 ) on Saturday July 10, 2010 @01:57PM (#32861056) Journal
      It's rather irksome how effectively marketers have pushed "resolution" rather than bitrate as a metric of video quality, despite the fact that, with digital video, the latter is generally far more important than the former (except, of course, for output devices like monitors and projectors, where the number of physical pixels really does matter, and input devices like cameras, where the number of pixels matters along with the quality of the glass, degree of compression, and a bunch of other fiddly stuff).

      As 20 seconds in the image manipulation program of your choice will easily demonstrate, you can resize an image (and, by extension, a series of images) from any resolution you have to any resolution you want, subject only to the limits of your RAM and your patience.

      If all video were lossless, or there were some iron law stating "thou shalt allocate no less than X bits per Y pixels", comparing videos by resolution might actually matter. As it is, though, in most real-world situations the limitation is in the bitrate (unless you have a really crap monitor), and, while you can smear your too-few-Mb/s MPEG-4 over as "high resolution" an output as you like, it isn't going to look any better.
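
      To put a number on the parent's point: what actually bounds quality is the average bits available per pixel per frame, which collapses as nominal resolution grows at a fixed bitrate. A minimal sketch (the 4 Mb/s figure is illustrative, not YouTube's actual encoder setting):

      ```python
      def bits_per_pixel(bitrate_mbps, width, height, fps):
          """Average compressed bits available per pixel per frame."""
          return bitrate_mbps * 1_000_000 / (width * height * fps)

      # The same 4 Mb/s budget spread over two nominal resolutions:
      print(bits_per_pixel(4, 1280, 720, 30))   # ~0.145 bits/pixel
      print(bits_per_pixel(4, 4096, 3072, 30))  # ~0.011 bits/pixel -- smeared thin
      ```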
      • except, of course, for output devices like monitors and projectors, where the number of physical pixels really does matter.

        Your monitor isn't an exception; it just has sufficient bandwidth to display uncompressed video. In other words, a "perfect" bit rate.

        And bit rate isn't the whole story; the method of compression has a tremendous effect on video quality. 1920x1080 30p ATSC takes the same amount of bandwidth as 480i NTSC. 320x240 24p uncompressed RGB video is about the same bit rate as Blu-Ray. If marketers pushed bit rate instead of resolution, we'd just have huge files with lousy compression.

        There are just too many variables

    • Re: (Score:3, Informative)

      by Jerf ( 17166 )

      What that basically means is that your 1080P video was overcompressed and did not actually contain "1080P"-worth of information. The 4K video is probably overcompressed and doesn't contain "4K" worth of information either, but it had more than the 1080P video. (In fact there's a decent chance the 4K video is simply about 1080P's worth done right.) You shouldn't be able to tell.

      Variable bit rate encodings mean that resolution is pretty much a fiction, as others have pointed out in this discussion.

      This is on

      • by cgenman ( 325138 )

        The Xbox 360 used to be a perfect example of this.

        Buy a video of something in SD. Then buy it again in HD. Watch them both on an SD screen. Theoretically, they should be basically exactly the same. But as it stood, the HD version displayed on an SD screen had far less artifacting, smoother black gradients, and a more solid apparent framerate. Therefore, the "HD" rate was actually about adequate for SD video. Sadly, at least at first it wasn't up to the task of HD video.

        The same can be said of cameras.

    • Ya 4k is stupid (Score:3, Informative)

      by Sycraft-fu ( 314770 )

      There are no consumer 4k monitors out there, none. You CAN find 4k large displays if you try. Barco makes some that are close (3840x2160) like their LC-5621 but that costs nearly $40,000. 4k is just not the sort of thing you find on a desktop PC or in a consumer's home.

      As such, doing video on a site like YouTube in it is worthless. Actually, it is worse than worthless since, as you noted, it overloads the decoding ability of current hardware and causes slowdown. There is just no point, at all, on current desktops

    • by Deorus ( 811828 )

      Was coming to report the same. How sad... My 100mbps Internet connection streams the whole thing just fine, but my 8800GTS grinds to a halt attempting to play it... Never thought I'd see the day when the bottleneck would be on my computer rather than on my Internet connection...

  • Stop the hatin' (Score:5, Insightful)

    by IGnatius T Foobar ( 4328 ) on Saturday July 10, 2010 @01:54PM (#32861040) Homepage Journal
    I see it already, the army of Slashdotters saying "no one has the bandwidth for this" and "no one has the video hardware for this" and "YouTube's implementation of this sucks." Well, that's ok. The point is that they're pushing the limits. Remember the first time you saw any video at all on a computer? Chunky, blocky, slow, tiny video coming off a CD-ROM in the early 1990's, perhaps? Yeah, it sucked, but the point was that they were showing something that would, eventually, evolve into something useful. Without the crappy CD-ROM graphics of the early 1990's, there would be no YouTube today. Someone's got to be the first to try it, someone's got to get the technology out there so it can be improved. Wouldn't you like to eventually watch YouTube in HD directly on your television? Today you've got to jump through hoops to do that. Tomorrow it might be as effortless as watching YouTube on your desktop computer.
    • And in a sense we're already there on the capture end -- you can plunk down $300-$400 and get a digital SLR (or Micro Four Thirds "thing that is like an SLR but with no reflex mirror") that shoots at precisely this resolution. The only thing stopping them from storing video at these frame rates is the ability to get all the data off of the CMOS sensor and onto the memory card fast enough. 4000 x 3000 x 30 fps x 12 bits per pixel is 4.3Gbps, which is hard. But Olympus already makes a cheap digital SLR that will
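
      The parent's data-rate figure checks out, assuming 12 bits per photosite read straight off the sensor:

      ```python
      # Raw sensor readout for 4000 x 3000 at 30 fps, 12 bits per photosite.
      bits_per_second = 4000 * 3000 * 30 * 12
      print(bits_per_second / 1e9)  # 4.32 -- i.e. the ~4.3 Gbps quoted above
      ```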

    • Re:Stop the hatin' (Score:4, Insightful)

      by Kev Vance ( 833 ) <kvance@NOSPaM.kvance.com> on Saturday July 10, 2010 @03:51PM (#32861552) Homepage

      Thank you! Reading page after page of complaints about this was disheartening. Not everyone has lost their sense of imagination.

      • by Fumus ( 1258966 )

        Thank you! Reading page after page of complaints about this was disheartening. Not everyone has lost their sense of imagination.

        There is one person roughly halfway through that seemed to be quite excited about the prospect of more detailed animal por^H^H^Hvideos.

    • When colour television came out it was revolutionary.
      When VCRs allowed you to record your show it was revolutionary.
      When DVD came out and gave you perfect images, additional content, and a quality increase everyone could see even with an old CRT, it was revolutionary.

      Now Blu-ray is out and has a slightly higher resolution and the headscratching idea that a TV must be connected to the internet. Many people don't have a TV with which they can tell the difference in quality. This is evolutionary.
      With
  • A hundred times more useful than a 4k resolution would be to allow videos more than 10 minutes long. Or better quality for 1080p. Or, my pet favourite, frame rates at around 60fps (and yes, obviously ~60fps video does appear much smoother than half that frame rate, as has been discussed countless times on /.).

    • by sznupi ( 719324 )

      They would have to pay MPEG-LA for each streaming of H.264 if the video is more than 12 minutes in length.

      • This could enable longer videos, sure.

        Make a 4000x3000 video where each frame consists of a mosaic of 16 successive frames of a 1000x750 video, and then release a third-party plugin that will pull out the frames and play the original video.
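
        A sketch of the tile bookkeeping such a plugin would need; the 4x4 grid layout and names here are hypothetical, and a real implementation would also have to survive lossy compression bleeding detail across tile edges:

        ```python
        # Pack 16 consecutive 1000x750 source frames into one 4000x3000 carrier
        # frame, left to right and top to bottom, so each carrier frame holds
        # 16 frames' worth of the original video.
        TILE_W, TILE_H, GRID = 1000, 750, 4

        def tile_origin(frame_index):
            """Top-left pixel of the tile holding the given source frame."""
            slot = frame_index % (GRID * GRID)
            return (slot % GRID) * TILE_W, (slot // GRID) * TILE_H

        print(tile_origin(0))   # (0, 0)
        print(tile_origin(5))   # (1000, 750)
        print(tile_origin(15))  # (3000, 2250)
        ```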

  • by Animats ( 122034 ) on Saturday July 10, 2010 @02:11PM (#32861122) Homepage

    James Cameron (Titanic, Avatar, etc.) says that higher frame rates are more valuable than higher resolution. He wanted to do Avatar at 48FPS, but the technology wasn't there yet. The sequel probably will be at a higher frame rate. Cameron points out that 4K resolution is worthless beyond the first few rows of the theater, but frame rate benefits all viewers.

    It's a real issue for Cameron, who, as a director, likes sweeping panoramas with high detail. If you pan slowly over a high-resolution scene at 24FPS, there are visible artifacts. This precludes certain shots which look great and ought to be in the movie. It's necessary to defocus slightly or add motion blur for certain shots.

    So YouTube should work on getting their frame rates up, not their resolutions. Let's see some IMAX movies at 48FPS on YouTube.

    • Re: (Score:3, Interesting)

      Yes, and after frame rate higher dynamic range. Screen resolution is a distant third.

    • by karnal ( 22275 )

      So, once James Cameron does this, he can release a press statement:

      "I've upped my frame rates. Up yours!"

    • When Doug Trumbull invented Showscan. [wikipedia.org]

  • Oh dear (Score:4, Funny)

    by Quiet_Desperation ( 858215 ) on Saturday July 10, 2010 @02:24PM (#32861184)

    And a million internet tubes cried out in pain.

    "Holy fuck! How many pixels by how many! What? HOW MANY? WTF? o_O We're gonna need a bigger pipe!"

  • ...sounds a lot like this thing called "television" that we used to have back in the last century.

  • All it takes to view these videos in their native resolution is a $60,000 4k monitor like the one available here: http://www.bhphotovideo.com/c/product/676516-REG/Astro_Design_Inc_DM_3410_4K_x_2K_10.html [bhphotovideo.com]

    For a billionaire, 60 grand is not even 1/100 of a percent of their total fortune. Not to mention that they could have Google pay for the screen, because technically messing around with 4k resolution is a business expense... (even if Larry or Sergey were using it to view equine pornography)

    • Ok, this is like the third or fourth post I've seen in this discussion mentioning horse porn. Did I fucking miss something?

  • http://www.digitalsociety.org/2010/07/youtube-adds-4k-video-capability-but-how-improving-1080p-first/ [digitalsociety.org]
    Google just announced that YouTube will now support "original" resolutions of up to "4096P", but it's actually a maximum of 3072P narrowscreen or 2304P widescreen. This announcement makes it sound as if our computers and broadband connections lag Google YouTube, when YouTube is actually the weakest link. YouTube's biggest problem is their over-compressed "HD" videos
    • by gmueckl ( 950314 )

      The term "4096p" is a term that the journalist copying from the original blog post came up with. It's clearly wrong: the video does not have 4k rows of pixels. The the term "4k" for the resolution comes from the 4k pixels horizontally. The digital big screen film formats have been classified into "2k" and "4k" categories because of their horizontal numbers of pixels for some 10 years or so now. The 1080p HDTV format is slightly below the 2k big screen resolutions. But in television, the people are counting

  • My computer's GPU fan just kicked on for the very first time since I bought this machine about 6 months ago.

    At first I thought there was something wrong.

    And then I realized that, for the first time... there was something right.

  • Don't know where they get that from; the Wikipedia article the article links to doesn't even back it up (with the exception of 3D, but there you have one projector for each eye, which does not increase resolution). All you have to do is look at the projection booth at an IMAX theater. At the Oregon Museum of Science and Industry (and, apparently, several other Omnimax/IMAX Dome theaters), the projection room is a big glassed-in room where you watch the projector in action showing the movie before the one

  • Youtube can't even stream 720p in the evenings (at least here in Sweden)... Maybe they should solve that problem first...

  • You'd figure with YT being owned by Google and all that their search would be great, but it's still largely rubbish. When're they going to fix that?
  • I hope that Leanback thing is better than the current suggestion system.
    After seeing 300+ videogame videos, 200+ videos of avian life, and 100 cooking-related shows, all I get is suggestions for videos some random person linked me to via IM that are extremely off-topic. So much "youtube poop" and crap like that. Also, I once watched a SINGLE video about a game someone wanted me to see - I think it was Supreme Commander or something like that - and I still keep getting 5+ of those in my suggestions every update

  • It would be more useful if a link were simply provided to download the original-resolution file without having to mess around with JavaScript. In Safari, when you try to download by opening the Activity window and double-clicking the streaming file, instead of it showing up in the Downloads window as normal, a browser window is opened and the movie plays in a QuickTime movie player embedded in the browser.

    As others note, longer submissions, higher frame rates, and no compression would be more useful than 4K. And

  • IMAX is NOT 2K (Score:3, Informative)

    by hamiltondaniel ( 1406971 ) on Sunday July 11, 2010 @06:51PM (#32869810)
    Whoever wrote this does not know anything about IMAX. IMAX is not projected digitally, let alone with a 2K digital projector.

    35mm film is about equal to, or a little better than, 4K digital in terms of resolution. Most of the time when you go see a movie these days, it is still being projected on 35mm film. It's cheaper than a digital projector and looks better. When you went and saw Avatar in 3D it was being projected in 2K digital (for almost everyone), and that's why you could see the pixels on the screen. 2K is NOT good enough for anything but very small movie screens. Anyone who says it is is not a cinematographer (I am) and has never used both a 35mm film camera and the best digital cinema cameras (I have), and probably doesn't know what a cinematographer even really is. 99% of all big-budget films are still shot on 35mm film because it is simply the third-best image-capturing method out there, better than ANY digital camera in existence today. It is also much more expensive, but on large films the price of film is a drop in the bucket compared to everything else.

    The second best cinema image-capturing method out there is 65/70mm film.

    The best is IMAX.

    IMAX is 65/70mm film travelling through the camera horizontally; each frame is about 2.75 by 2 inches. That is enormous. It's like a medium-format still camera...except 24 times a second. Here's a comparison of IMAX to regular 35mm film (most digital cinema cameras have sensors, by the way, about exactly the same size as a 35mm film camera): http://en.wikipedia.org/wiki/File:Imaxcomparison.png [wikipedia.org]

    IMAX is NOT projected on 2K. "Digital IMAX" is. Digital IMAX is pretty much useless and is not even as good as standard 35mm film projection; it uses two 2K projectors overlapping each other to give a slightly higher than 2K theoretical resolution to the image; for those of you with still cameras, 2K is about equal in resolution to a 6- or 7-megapixel camera. Congratulations. Your $1000 SLR has way more resolution than a digital cinema projector that costs a half million dollars.

    Real IMAX, i.e. horizontal 65/70mm film, has an estimated resolution of about 104 million pixels; you would need a 12K x 9K digital sensor to even come close to the resolution of IMAX. No one makes those and no one will for a long time, if ever. The highest-resolution digital photographic sensors outside of the military are probably Hasselblad digital medium format backs; they are about 60 megapixels, or half the resolution of IMAX film, and they are still cameras capable of only about one frame a second.
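
    The megapixel arithmetic in that paragraph is easy to check (the 104 MP and 60 MP figures are the estimates quoted above, not measurements):

    ```python
    imax_film_est_mp = 104                   # quoted estimate for horizontal 65/70mm IMAX
    sensor_12k_9k_mp = 12_000 * 9_000 / 1e6  # 108.0 -- "12K x 9K" is indeed ~that
    hasselblad_back_mp = 60                  # quoted medium-format digital back
    print(hasselblad_back_mp / imax_film_est_mp)  # ~0.58 -- "about half", as claimed
    ```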

    IMAX is not 2K. Digital IMAX is not IMAX.

    IMAX is film. Film is incredible.
