What was it like watching a 1080p movie blown up onto a huge cinema screen? Was it noticeable, or did people just go with it?
It looked unironically better, more soulful and more filmic than the 8K slop of today.
Color grading and palettes in general have also gone downhill a lot since then. Watch any digital film from the early 00s and compare it to modern shit. It's very noticeable.
People were watching TV on CRTs and early flatscreens in 480p so it was hard to tell either way.
Yeah, but watching shitty resolution stuff on a small screen is way less distracting than watching it on the big screen.
The point was people were watching stuff in shit quality 24/7 so that was their base of reference. Back then almost no one could spot the difference.
Lossless 1080p isn't the same as shitty YiFy rips, dumbass.
Standards were different back then and thus we were more forgiving. For example, I remember thinking KotOR had great graphics, but going back to it now, it's so bad. The "crowd" for the dueling arena are low-res sprites lol. Still a great game though.
I remember watching the movie. I never gave a thought to that shit; all I knew was that DVD was awesome and the next best thing was actually going to the cinema.
Is that a star bores movie? None of them looked good. The full orchestra used sounded good. That's all.
I don't understand the question. Nobody gave a fuck about how many pixels were on screen back then, it was all about how many inches it was.
>1080p blown up
That's not how it worked at all
It looks fine because most of AOTC showings were converted to 35mm, like putting a CRT filter on an old console game playing on a modern screen.
yes
Colors look off
it's 35mm
colors degrade with time
Sounds logical. Anywhere we could download a 35mm version?
Do you know the difference between lossy and lossless compression?
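To make that distinction concrete, here's a toy Python sketch (not how any actual video codec works, just the principle): a lossless round-trip gives back the exact original bytes, while a lossy step throws information away that can never be recovered.

```python
import zlib

data = b"frame data " * 1000

# Lossless: zlib compresses and decompresses back to the exact bytes.
packed = zlib.compress(data)
assert zlib.decompress(packed) == data

# Lossy (toy stand-in): quantizing 8-bit samples down to 16 levels
# shrinks the data, but the original values are gone for good.
samples = bytes(range(256))
quantized = bytes((b // 16) * 16 for b in samples)
assert quantized != samples
```

A shitty YIFY rip re-encodes an already-lossy source at a low bitrate, so it loses information twice; a lossless 1080p master keeps every bit of the source.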
I’m betting AOTC was rendered in 2k and most people saw it on film so it looked normal.
Saw it twice in theaters, in an era where most of the projections were on film. Ep2 was a bit blurry and had this video motion blur, but it was fine in a 35mm theater.
I saw it the second time in one of the only digital screening available. It was sharper, but more video than ever.
Um... you DO know that most movies had a 2K Digital Intermediate as standard until very recently, right? 2K and 1080p have the same vertical resolution: 1,080 lines. However, a 2K image is slightly wider than 1080p. If you watched stuff like Avengers: Infinity War and Avengers: Endgame at the cinema, then you almost certainly watched a 2K digital projection, as those movies were even shot completely digitally, so there's no chance you saw them projected from something like 35mm film stock.
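The numbers check out as a quick sanity calculation (DCI 2K is 2048x1080; consumer Full HD is 1920x1080):

```python
# DCI 2K container vs consumer 1080p: same line count, 2K is wider.
dci_2k = (2048, 1080)
full_hd = (1920, 1080)

assert dci_2k[1] == full_hd[1]        # both have 1,080 lines
extra_width = dci_2k[0] - full_hd[0]  # 2K carries 128 extra columns
aspect_2k = dci_2k[0] / dci_2k[1]     # ~1.90:1 container
aspect_hd = full_hd[0] / full_hd[1]   # 16:9 = ~1.78:1
```

So "2K" and "1080p" differ by about 6% in width and nothing in height, which is why a 2K DI looks indistinguishable from 1080p in practice.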
I haven't watched a movie in years, and as a kid I obviously wasn't observant or mindful enough to care.
It seems like only recently (since last year?) that a 4K Digital Intermediate is becoming standard with big-budget Hollywood releases, but 2K is probably going to stick around for a while yet. Yes 4K is better, but there's nothing "wrong" with 2K. Even Spider-Man: No Way Home in 2021 had a 2K master format.
>What was it like watching a 1080p movie
attack of the clones wasn't even 1080p
it was literally shot in 720p. this isn't a joke btw. no better quality exists of this movie.
I’m fairly certain it was shot in 1080i not p AVC
>le AVC
AVC didn't even exist in 2002.
I’ll have to look through my notes; I remember watching a special on digital cams and they mentioned this being the first big movie to shoot on the Sony digital cams they had custom fit for the new trilogy
That's not the same as AVC.
>SMPTE 367M, also known as SMPTE D-11, is the SMPTE standard for HDCAM. The standard specifies compression of high-definition digital video. D11 source picture rates can be 24, 24/1.001, 25 or 30/1.001 frames per second progressive scan, or 50 or 60/1.001 fields per second interlaced; compression yields output bit rates ranging from 112 to 140 Mbit/s. Each D11 source frame is composed of a luminance channel at 1920 x 1080 pixels and a chrominance channel at 960 x 1080 pixels. During compression, each frame's luminance channel is subsampled at 1440 x 1080, while the chrominance channel is subsampled at 480 x 1080.
https://en.wikipedia.org/wiki/HDCAM
Sounds pretty high quality:
>140mbps
>1440x1080 luma
The only shitty part is:
>480x1080 chroma subsampling
HDCAM that’s right and yeah it’s 4:2:0 lmaoooooooo
It's actually 3:1:1 (480 chroma samples against the 1440 stored luma width)
4:2:0 would be 960x540
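The plane sizes from the HDCAM spec quoted above fall out of the J:a:b subsampling notation directly. A small helper (my own illustration, not anything from the standard) makes the ratios easy to check:

```python
def plane_sizes(width, height, j, a, b):
    """Chroma plane dimensions for J:a:b subsampling of a luma plane.

    a/j is the horizontal chroma ratio; b == 0 means the chroma is
    also halved vertically (as in 4:2:0), b == a means full height.
    """
    horiz = width * a // j
    vert = height if b else height // 2
    return (horiz, vert)

# HDCAM: luma stored at 1440x1080, chroma at 480x1080 -> a 3:1
# horizontal ratio with full vertical resolution, i.e. 3:1:1.
assert plane_sizes(1440, 1080, 3, 1, 1) == (480, 1080)

# 4:2:0 on a full 1920x1080 raster would instead give 960x540 chroma.
assert plane_sizes(1920, 1080, 4, 2, 0) == (960, 540)
```

So the Wikipedia numbers (1440x1080 luma, 480x1080 chroma) are consistent with Sony's 3:1:1 scheme rather than 4:2:0 or 4:1:1.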
AVC history aside, yes.
https://en.wikipedia.org/wiki/CineAlta#History_and_use_in_motion_pictures
You've probably seen it a few times yourself since 2K is a fairly normal resolution size in digital theatres.
I saw Public Enemies in a cinema and it looked like it was shot on a mobile phone
Yes, I recall feeling the same way after the theater. I think it was because Michael Mann directed it without fully realizing the pitfalls early digital cinema filmmaking would have. He filmed parts of Ali, most of Collateral, and I think Miami Vice digitally. I think Miami Vice (2006) was one of the worst offenders until the Hobbit films came along, but it was very much a stylistic decision to "update" the look to the 2000s.
Shame because I think his other films like Heat, Manhunter, The Last of the Mohicans, Thief, look great.
Mann did something odd with Collateral that gives it some of the effects of high frame rates but not all of them. I assume it's something about the exposure time for each frame since he shot it at night; it makes the motion blur go away and makes things move smoother.
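That hunch matches how shutter angle works: per-frame exposure time is (angle/360)/fps, so a narrower shutter at the same frame rate means less motion blur and crisper, more "video"-looking movement. A quick check (standard figures, not anything specific to how Collateral was actually shot):

```python
def exposure_time(fps, shutter_angle_deg):
    """Per-frame exposure in seconds from frame rate and shutter angle."""
    return (shutter_angle_deg / 360.0) / fps

# The film-standard 180-degree shutter at 24 fps exposes 1/48 s per frame.
assert abs(exposure_time(24, 180) - 1 / 48) < 1e-12

# Narrowing the shutter to 90 degrees halves the exposure to 1/96 s,
# so each frame freezes motion harder and movement reads smoother/sharper.
assert exposure_time(24, 90) == exposure_time(24, 180) / 2
```

Digital cameras make it trivial to push exposure settings that would starve film stock of light at night, which fits the theory above.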
- Most movie projectors are still 2048x1080
- Almost all effects and ALL animation is still done at 2K, because despite the marketing horseshit, people literally can't tell the difference.
The only issue is that regular blu-rays only carry ~800 active lines for scope films. I've started downloading 4K remuxes and compressing them to 2520x1080 AV1 (21:9). The best of both worlds.
The Hobbit movies were projected at 4K at their IMAX premieres, but the effects scenes were all still done at 2K
... Which is why there was all of those complaints about how shitty everything suddenly looked when a dragon came onscreen or something...
Oh wait, there were no complaints at all BECAUSE PEOPLE LITERALLY COULDN'T TELL THE DIFFERENCE EVEN ON A SCREEN BIGGER THAN THEIR HOUSE
People were too distracted by the HFR to notice the problems with the effects.
The HFR is the only redeeming quality of the Hobbit trilogy.
hahah I remember going to see the midnight showing with HFR and EVERYONE gasped in disgust the moment we saw old ass bilbo walking down the hallway with the candle in the very beginning.
Dude, there were a LOT of noticeable bad CGI effects in the Hobbit films and I cringed multiple times watching them in theatre.
The worst offenders are probably the practical effects though; you can see all the makeup and fake beards whenever an actor's face is in close-up. Might be one of the reasons why CGI has become more prevalent actually.
hershlag still made dicks hard in 2k so nobody cared
I have all 3 movies as regular 1080p x264 rips and this one looks absolutely abhorrent. All are from the same group.
Is this just my release or is the 2nd one really looking way worse than the 1st and 3rd?