Early digital cinema

What was it like watching a 1080p movie blown up onto a huge cinema screen? Was it noticeable, or did people just go with it?

  1. 6 months ago
    Anonymous

    It looked unironically better, more soulful and more filmic than the 8K slop of today.
    Color grading and palettes in general have also gone downhill a lot since then. Watch any digital film from the early 00s and compare it to modern shit. It's very noticeable.

  2. 6 months ago
    Anonymous

    People were watching TV on CRTs and early flatscreens at 480i, so it was hard to tell either way.

    • 6 months ago
      Anonymous

      Yeah, but watching shitty-resolution stuff on a small screen is way less distracting than watching it on the big screen.

      • 6 months ago
        Anonymous

        The point was people were watching stuff in shit quality 24/7, so that was their frame of reference. Back then almost no one could spot the difference.

      • 6 months ago
        Anonymous

        Lossless 1080p isn't the same as shitty YIFY rips, dumbass.

  3. 6 months ago
    Anonymous

    Standards were different back then, and thus we were more forgiving. For example, I remember thinking KotOR had great graphics, but going back to it now, it's so bad. The "crowd" for the dueling arena is low-res sprites lol. Still a great game though.

  4. 6 months ago
    Anonymous

    I remember watching the movie and never giving a thought to that shit; all I knew was that DVD was awesome and the next best thing was actually going to the cinema.

  5. 6 months ago
    Anonymous

    Is that a star bores movie? None of them looked good. The full orchestra they used sounded good. That's all.

  6. 6 months ago
    Anonymous

    I don't understand the question. Nobody gave a fuck about how many pixels were on screen back then; it was all about how many inches it was.

  7. 6 months ago
    Anonymous

    >1080p blown up
    That's not how it worked at all

  8. 6 months ago
    Anonymous

    It looks fine because most AOTC showings were converted to 35mm, like putting a CRT filter on an old console game playing on a modern screen.

    • 6 months ago
      Anonymous

      yes

      • 6 months ago
        Anonymous

        Colors look off

        • 6 months ago
          Anonymous

          it's 35mm
          colors degrade with time

          • 6 months ago
            Anonymous

            Sounds logical. Anywhere we could download a 35mm version?

      • 6 months ago
        Anonymous

        Do you know the difference between lossy and lossless compression?

  9. 6 months ago
    Anonymous

    I’m betting AOTC was rendered in 2K, and most people saw it on film, so it looked normal.

  10. 6 months ago
    Anonymous

    Saw it twice in theaters, in an era when most projections were on film. Ep2 was a bit blurry and had this video-style motion blur, but it was fine in a 35mm theater.

    I saw it the second time at one of the only digital screenings available. It was sharper, but looked more like video than ever.

  11. 6 months ago
    Anonymous

    Um... you DO know that most movies had a 2K Digital Intermediate as standard until very recently, right? 2K and 1080p are the same height: 1,080 lines. A 2K image is just slightly wider (2048 vs. 1920 pixels). If you watched stuff like Avengers: Infinity War and Avengers: Endgame at the cinema, then you almost certainly watched a 2K digital projection; those movies were shot completely digitally, so there's no chance you saw them projected from something like 35mm film stock.
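
    The gap really is tiny; napkin math on the container sizes (a sketch, using the standard DCI 2K and consumer 1080p dimensions):

      # DCI 2K container vs. consumer 1080p: same height, ~7% wider.
      dci_2k = (2048, 1080)
      full_hd = (1920, 1080)

      print(dci_2k[0] / full_hd[0])   # 1.0667 -> 2K is only ~6.7% wider
      print(dci_2k[0] * dci_2k[1])    # 2211840 total pixels
      print(full_hd[0] * full_hd[1])  # 2073600 total pixels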

    • 6 months ago
      Anonymous

      I haven't watched a movie in years, and as a kid I obviously wasn't observant or mindful enough to care.

      • 6 months ago
        Anonymous

        It seems like only recently (since last year?) has a 4K Digital Intermediate become standard for big-budget Hollywood releases, but 2K is probably going to stick around for a while yet. Yes, 4K is better, but there's nothing "wrong" with 2K. Even Spider-Man: No Way Home in 2021 had a 2K master format.

  12. 6 months ago
    Anonymous

    >What was it like watching a 1080p movie
    attack of the clones wasn't even 1080p

    it was literally shot in 720p. this isn't a joke btw. no better quality exists of this movie.

    • 6 months ago
      Anonymous

      I’m fairly certain it was shot in 1080i, not p, AVC

      • 6 months ago
        Anonymous

        >le AVC
        AVC didn't even exist in 2002.

        • 6 months ago
          Anonymous

          I’ll have to look through my notes. I remember watching a special on digital cams, and they mentioned this being the first big movie shot on the Sony digital cams they had custom-fit for the new trilogy.

          • 6 months ago
            Anonymous

            That's not the same as AVC.
            >SMPTE 367M, also known as SMPTE D-11, is the SMPTE standard for HDCAM. The standard specifies compression of high-definition digital video. D11 source picture rates can be 24, 24/1.001, 25 or 30/1.001 frames per second progressive scan, or 50 or 60/1.001 fields per second interlaced; compression yields output bit rates ranging from 112 to 140 Mbit/s. Each D11 source frame is composed of a luminance channel at 1920 x 1080 pixels and a chrominance channel at 960 x 1080 pixels. During compression, each frame's luminance channel is subsampled at 1440 x 1080, while the chrominance channel is subsampled at 480 x 1080.
            https://en.wikipedia.org/wiki/HDCAM
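
            Rough arithmetic on those spec numbers (a sketch only; the per-frame figure naively divides the top quoted bit rate evenly across frames):

              # HDCAM (SMPTE D-11) figures from the quote above.
              luma_w, luma_h = 1440, 1080      # luma stored at 1440x1080
              chroma_w, chroma_h = 480, 1080   # chroma stored at 480x1080

              print(luma_w / chroma_w)             # 3.0 -> chroma has 1/3 the luma's horizontal resolution

              bitrate_bps = 140e6                  # top of the quoted 112-140 Mbit/s range
              fps = 24
              print(bitrate_bps / fps / 8 / 1024)  # ~712 KiB per compressed frame, on average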

            • 6 months ago
              Anonymous

              Sounds pretty high quality:
              >140mbps
              >1440x1080 luma
              The only shitty part is:
              >480x1080 chroma subsampling

              • 6 months ago
                Anonymous

                HDCAM that’s right, and yeah it’s 4:2:0 lmaoooooooo

              • 6 months ago
                Anonymous

                It's effectively 4:1:1 against the full 1920 width (HDCAM's official designation is 3:1:1, measured against its 1440-wide luma)
                4:2:0 would be 960x540
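
                Chroma plane sizes for the common schemes against a 1920x1080 luma plane, for comparison (napkin math):

                  # Chroma plane dimensions for a 1920x1080 luma plane.
                  w, h = 1920, 1080
                  schemes = {
                      "4:4:4": (w,      h),        # no subsampling
                      "4:2:2": (w // 2, h),        # half horizontal -> 960x1080
                      "4:2:0": (w // 2, h // 2),   # half both ways -> 960x540
                      "4:1:1": (w // 4, h),        # quarter horizontal -> 480x1080
                  }
                  for name, dims in schemes.items():
                      print(name, dims)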

          • 6 months ago
            Anonymous

            AVC history aside, yes.

            https://en.wikipedia.org/wiki/CineAlta#History_and_use_in_motion_pictures

  13. 6 months ago
    Anonymous

    You've probably seen it a few times yourself since 2K is a fairly normal resolution size in digital theatres.

  14. 6 months ago
    Anonymous

    I saw Public Enemies in a cinema and it looked like it was shot on a mobile phone

    • 6 months ago
      Anonymous

      Yes, I recall feeling the same way leaving the theater. I think it was because Michael Mann directed it without fully realizing the pitfalls early digital cinema filmmaking would have. He shot parts of Ali, and most of Collateral and I think Miami Vice, digitally. I think Miami Vice (2006) was one of the worst offenders until the Hobbit films came along, but it was very much a stylistic decision to "update" the look to the 2000s.

      Shame, because I think his other films, like Heat, Manhunter, The Last of the Mohicans, and Thief, look great.

      • 6 months ago
        Anonymous

        Mann did something odd with Collateral that gives it some of the effects of high frame rates, but not all of them. I assume it's something about the exposure time for each frame, since he shot it at night; it makes the motion blur go away and makes things move more smoothly.
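
        If the shutter theory is right, the numbers work out like this (the specific angles below are illustrative guesses, not figures from the production):

          # Per-frame exposure from shutter angle: exposure = (angle / 360) / fps.
          def exposure_seconds(shutter_angle_deg, fps=24.0):
              return (shutter_angle_deg / 360.0) / fps

          print(exposure_seconds(180))  # 1/48 s -> the classic "filmic" amount of motion blur
          print(exposure_seconds(90))   # 1/96 s -> less blur per frame, crisper motion
          print(exposure_seconds(270))  # ~1/32 s -> more blur, but more light for night shooting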

  15. 6 months ago
    Anonymous

    - Most movie projectors are still 2048x1080
    - Almost all effects and ALL animation are still done at 2K, because despite the marketing horseshit, people literally can't tell the difference.

    • 6 months ago
      Anonymous

      The only issue is that regular Blu-rays of scope movies only have about 800 lines of actual picture (1920x~800 once you crop the letterbox). I've started downloading 4K remuxes and compressing them to 2520x1080 AV1 (21:9). The best of both worlds.
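
      Something like this, presumably (a sketch only: it assumes an ffmpeg build with libsvtav1, a source whose letterbox actually crops to ~21:9, and made-up filenames and CRF):

        # Crop a letterboxed 3840x2160 scope remux to its active picture,
        # downscale to 2520x1080 (~21:9), and encode with SVT-AV1.
        import subprocess

        subprocess.run([
            "ffmpeg", "-i", "remux_2160p.mkv",        # hypothetical input filename
            "-vf", "crop=3840:1646,scale=2520:1080",  # 3840/1646 ~ 2.33, i.e. ~21:9
            "-c:v", "libsvtav1", "-crf", "30", "-preset", "6",
            "-c:a", "copy",                           # pass the audio through untouched
            "out_2520x1080_av1.mkv",
        ], check=True)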

  16. 6 months ago
    Anonymous

    The Hobbit movies were projected at 4K at their IMAX premieres, but the effects scenes were all still done at 2K
    ... which is why there were all those complaints about how shitty everything suddenly looked when a dragon came onscreen or something...

    Oh wait, there were no complaints at all BECAUSE PEOPLE LITERALLY COULDN'T TELL THE DIFFERENCE EVEN ON A SCREEN BIGGER THAN THEIR HOUSE

    • 6 months ago
      Anonymous

      People were too distracted by the HFR to notice the problems with the effects.

      • 6 months ago
        Anonymous

        The HFR is the only redeeming quality of the Hobbit trilogy.

        • 6 months ago
          Anonymous

          hahah I remember going to see the midnight showing in HFR, and EVERYONE gasped in disgust the moment we saw old-ass Bilbo walking down the hallway with the candle at the very beginning.

    • 6 months ago
      Anonymous

      Dude, there were a LOT of noticeable bad CGI effects in the Hobbit films, and I cringed multiple times watching them in the theater.

      The worst offenders are probably the practical effects though; you can see all the makeup and fake beards whenever an actor's face is in closeup. Might be one of the reasons why CGI has become more prevalent, actually.

  17. 6 months ago
    Anonymous

    hershlag still made dicks hard in 2k so nobody cared

  18. 6 months ago
    Anonymous

    I have all 3 movies as regular 1080p x264 rips, and this one looks absolutely abhorrent. All are from the same group.
    Is this just my release, or does the 2nd one really look way worse than the 1st and 3rd?
