>>
>>
>>
>>
>>
>>738746067
It was the era of "your eyes can't see above 30 FPS" retards constantly excusing everything, so developers and publishers were like "LMAO they are defending us so much that we don't even need to optimize our games".
Simple as.
>>
>>
>>
>>
File: Screenshot 2026-05-12 211322.png (679.2 KB)
>>738746067
Difficult to program for; it took them years to work it out. I remember big hype when Lair was announced, it was going to feature huge battles never seen before on console and use the sixaxis controls to make you really feel like you were controlling a dragon.
It looked like shit and controlled like shit, and the worst thing is you could ONLY use it with the sixaxis controls. They eventually patched that later on, but only after massive backlash
>>
File: 1688429123121166.png (149.8 KB)
>>738746067
>why did the best games of all time run like shit on an underpowered system?
>>
>>
>>
>>
>>
>>738746651
>PS3 era was some dark times. Every port was 540p 22fps and the exclusives ran at 720p unstable 30. the 360 was way better.
The 360 really wasn't much better. It had tons of games that couldn't really scrape 30 fps together.
>>738746632
Wasn't underpowered at launch. Just didn't age well at all.
>>
>>738746651
The Xbox 360 was so much better than the PS3 that even the JAPANESE started developing for it primarily. That's how badly Sony fucked up. They got Mark "Our Guy" Cerny to design the PS4, though, and things balanced back out until Nintendo purchased the entire Japanese archipelago and enslaved its inhabitants.
>>
>>
>>
>>738746067
After getting a gaming pc and enjoying vidya at 60 fps minimum it seems insane to me that the 20 fps most ps3 games ran at was ever acceptable. It's a fucking slideshow and you can't reliably control your character.
>>
>>
>>
>>
>>738747282
>>738747379
Something like Ocarina of Time or Shadow of the Colossus looks like a kid's first animation on a post-it note pad, but after twenty minutes you just forget. Frame pacing is a much bigger issue, which is why Bloodborne is still so infamous.
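Loosely putting numbers on that (the frame times below are invented for illustration, not measurements from any real game): two captures can average the same framerate while one of them spends half its frames at 50 ms, which is the "bad pacing" feeling.
```python
# Toy comparison: same ~30 fps average, one steady run and one stuttering run.
# The frame-time lists are made up to illustrate the point, not real data.
even   = [33.3] * 8
uneven = [16.7, 50.0] * 4   # alternating fast/slow frames

def summarize(frame_times_ms):
    avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
    return avg_fps, max(frame_times_ms)

for name, times in (("steady", even), ("stuttery", uneven)):
    avg_fps, worst = summarize(times)
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {worst:.1f} ms")
```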
>>
>>738747269
>the processor isn't very good for vidya
It's a lot more than that.
>PS3 used larger assets because of bluray
>Assets read slowly from disc
>The GeForce 7600/7800-class chip the GPU was based on dated horribly; it was replaced by the unified shader architecture in the 8000 series.
>Processor required XDR RAM
>XDR RAM was locked to the CPU
>Xbox 360 had a shared pool of RAM
>Xbox games could often use ~300-320MB of RAM for the GPU; PS3 could only use the 256MB of GDDR3 for the GPU (rough numbers sketched below)
>Nvidia made Sony pay every time they wanted PSGL (a custom graphical library for the PS3) to be updated
And then on top of that, you had all the CPU shenanigans. I've heard the PS3 was originally supposed to use two Cell CPUs in tandem, no GPU, but this was abandoned last minute, and they ended up buying Nvidia chips.
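Putting rough numbers on the memory point above; the OS reservation and CPU-side figures here are ballpark assumptions, not exact specs.
```python
# Back-of-the-envelope memory budgets in MB.
# OS reservations and the CPU-side data figure are rough assumptions.
XBOX360_UNIFIED = 512            # single shared pool
PS3_XDR, PS3_GDDR3 = 256, 256    # split pools: XDR for Cell, GDDR3 for RSX

xbox_os_reserved = 32            # approximate
xbox_cpu_side    = 180           # hypothetical amount a game keeps for CPU data

# 360: whatever the CPU side doesn't need can go to textures and render targets.
xbox_gpu_budget = XBOX360_UNIFIED - xbox_os_reserved - xbox_cpu_side
# PS3: the GPU is effectively capped by its own 256 MB pool
# (it can read XDR too, but slowly, so most assets live in GDDR3).
ps3_gpu_budget = PS3_GDDR3

print(xbox_gpu_budget, ps3_gpu_budget)   # roughly 300 vs 256
```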
>>
>>738747650
SotC was never bearable on the PS2. I remember renting it when it came out and it was literally 15fps at best, with retarded motion blur and bloom effects blinding you at every turn. The PS3 remaster should be considered the official version.
>>
>>
File: 1700307741370453.gif (712.2 KB)
>>738746067
Consumer standards were lower.
I'm still amused when I remember that Sony fanboys at the time, here on /v/ as well, actually thought the pre-rendered cutscenes were real-time graphics.
>>738747269
>>738747647
And the memory.
The 360's unified 512MB was a much smarter approach than separate 256MB pools of system RAM and VRAM.
Also pretty much everyone else had given up on Rambus and their memory architecture, but Sony just had to be extra special.
>>
>>
>>
>>
>>
>>738747908
SotC must have been one of the worst-performing games on the PS2, right? It's still one of my favorite games, but the framerate was always one of those issues I just accepted because everything else was so good.
>>
>>
>>
>>
>>
File: 1578415699722.png (16.6 KB)
>>738747908
>The PS3 remaster should be considered the official version.
I think it broke a lot of the graphics effects though
>>738749059
>>738749486
A lot of PS2 games drop down to the teens all the time; GTA SA runs like shit too, but millions of people remember it fondly.
Besides SotC, I think one of the absolute worst PS2 games I've tried was the Mafia port. I'm not sure if anyone actually played through the whole game on a PS2, or whether it's even possible to do it.
>>
>>738749726
>A lot of PS2 games drop down to the teens all the time, GTA SA runs like shit too but millions of people remember it fondly.
I was like 10 when SA came out. There wasn't anything else like it. VC was cool, but SA was jaw-droppingly amazing. I remember copying out cheat codes for friends.
>>
>>
File: 1642062345993.png (216.2 KB)
um this always happened. NES games ran like shit, SNES games ran well, PS1/N64 games ran like shit, PS2/Gamecube games ran well, PS3/360 games ran like shit, PS4 games ran well, PS5 games run like shit
>>
>>
>>738746067
Because console games ran like shit from the PS1 to the PS4. The PS5's biggest upgrade was offering 60 fps modes for most games, and it generally hits that target too. Obviously the games run at sub-1080p at times, but that's preferable to only having an unstable 30 fps mode like before.
>>
>>
>>738749994
>PS4 games ran well
No they fucking didn't. The Jaguar dogshit CPU made sure hardly any current game was hitting 60 fps even when the GPU had the leeway to do so, and it very regularly couldn't even hold 30 in complex scenes. The Zen 2 CPU in the PS5/XSX isn't great, but it's a desktop-class CPU, which makes it infinitely better than the abomination that was the PS4/Xbone CPU.
>>
>>738750061
>>738750209
>>738750373
I'm embarrassed so I'm leaving the thread now desu
>>
>>
File: 1659715202346821.jpg (22.9 KB)
>>738749815
I mean yeah.
And a lot of the posters here, if they even played SA when it came out, were young like you, so what the fuck did you even know about frame rates at the time.
Now all these kids are grown up and pretend like anything under 60fps is a slideshow, literally unplayable.
>>
>>738750746
>And a lot of the posters here if they even played SA when it came out, were young like you, so what the fuck did you even know about frame rates at the time.
It probably helped that when I played it on PS2, it was as a Brit on a CRT. Our PS2 games largely ran at 25fps anyway to conform to PAL standards. Having a CRT meant less blur.
>Now all these kids are grown up and pretend like anything under 60fps is a slideshow, literally unplayable.
I really don't like playing at 30fps anymore. I have a G-Sync monitor with a tolerance of like 48fps and above, so it starts to suffer below that. A few games have felt OK at lower framerates - Rockstar did a pretty good job making RDR2 feel good for a 30 fps game on PS4/PS5.
Despite having a 280Hz monitor, I usually aim for 120 fps in games.
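For reference, those targets map to frame-time budgets with a simple 1000/fps conversion (48 is just my monitor's stated floor, nothing universal):
```python
# Frame budget in milliseconds for the framerates mentioned above.
for fps in (20, 25, 30, 48, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
# 20 fps means a 50 ms budget per frame; 120 fps means an 8.3 ms budget.
```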
>>
>>
File: 3mdxl9.jpg (39.2 KB)
>>738751140
A desktop R5 3600 would smoke it: in addition to the desktop CPU's higher clock speeds, the PS5 reserves an entire core for the OS, and its memory latency is atrocious, which has a massive impact on Zen 2 performance.
Still a massive improvement over the previous generation though. When the specs of the Xbone leaked in 2012 or whenever it was, I thought they were fake at first; no way were they using such shitty architecture at such low clock speeds.
Even the (comparatively) faster Bulldozer cores overclocked to 4.5+ GHz were having a hard time keeping up with a stock i3 at the time.
>>
>>738746067
TLOU pushed the system. There were better-running, good-looking exclusives, like Killzone 3. There's nothing wrong with TLOU, really. Yeah, it drops into the 20s frequently, but the controls don't feel terrible, and there are no huge stutters. It's unironically fine. I'd rather play the game in this state on original hardware than the PS4 remaster, to be honest. I think the remaster is too high-res for the art, and it looks wrong smoother.
>>
>>
>>738746067
They didn't. The Last of Us ran very well. Just like Shadow of the Colossus ran very well on the PS2.
It had good motion blur.
Good graphics are worth a low framerate with that, and it makes them work, in third person at least.
I never felt The Last of Us was performing badly.
Nor Heavy Rain. Nor Killzone 2 or 3. There was lag but it didn't make the games unplayable.
Home was good too. If that had a low framerate I didn't notice it.
The only game that had an unplayably low framerate I experienced was FFXIII-2.
Why was the PS3 so fucking good sometimes?
Who the fuck knows.
Quincunx maybe. NVIDIA were high quality once upon a time. Maybe there is something in how the CPU worked too. Like one controller and 6 underlings. Was 8 underlings but Sony was greedy and degenerate. Smashed one SPU on purpose so chips with defective SPUs could be sold. And made one SPU locked to the system.
Nothing of the 7th gen should have been at 720p. That was a fuckup. Lower res is better. We could have seen the graphics in the lies. We are only starting to now.
The Last of Us was fantastic. And the PS3 version is the best version. All others look worse. Maybe how it is on an older system needs to be preserved because the ability to impress might be tied to that. Only impressive when pushing hardware then. Otherwise has to compete with the stuff of the next gen. Be more innovative. And can't really be without ruining what it was.
PS3 version has better shading.
It was great and new to look around like that. Like just explore. Take your time looking around. It was enjoyable enough to look around. Look at things. It was a very pretty game. Rage was too, for similar reasons but that is another story.
The story of TLOU is concluded in 1. Without the DLC. Anyone playing the DLC, Sequels or Part 1/Part 2 are bad people.
>>
File: 2452094343.gif (190.6 KB)
>>738749994
>zoomer wants me to just forget that ps2 games ran at 60 fps
>all because his tranny movie console reputation is at stake
>>
>>738746067
Never use that degenerate YouTube channel. Those homosexual pedophiles who preyed on kids pushed the idea that resolution is quality. They are some of the stupidest parasites on the planet and the world is worse because of them.
>>
>>
>>
>>
>>738747710
> to use two Cell CPUs in tandem, no GPU, but this was abandoned last minute, and they ended up buying Nvidia chips.
It was abandoned because not even they knew how to make modern graphics with only the CPU kek.
They were so obsessed with their CELL plan of putting it everywhere (toasters, refrigerators, TVs, etc.) that they forgot they were making a video game console.
>>
File: 1770564303912934.gif (3.5 MB)
>>738746130
>>
>>738754459
>It was abandoned because not even they knew how to make modern graphics with only the CPU kek.
Sounds about right.
>>738752852
A lot of PS2 games also didn't run at 60.
>>
>>738754459
>>738747710
On another note, the PS3 GPU was close to the best of 2005.
RSX was a cross between the GeForce 7800 GTX (matching its shader count) and the 7600 GT (due to its 128-bit memory bus and 8 ROPs).
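Assuming the commonly cited clocks (treat them as approximations, not quoted specs), the 128-bit bus is easy to put into peak-bandwidth numbers:
```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
# Clock figures are the commonly cited ones, used here as assumptions.
def peak_bandwidth_gb_s(bus_bits, effective_mt_s):
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

print(peak_bandwidth_gb_s(256, 1200))  # 7800 GTX, 256-bit bus: ~38.4 GB/s
print(peak_bandwidth_gb_s(128, 1400))  # 7600 GT / RSX, 128-bit bus: ~22.4 GB/s
```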
>>
>>738754459
I doubt that. But they should have gone with the 8800 GTX/Ultra, which was 3x more powerful than the chip they did use.
Shared memory is obviously better, but splitting the memory does at least force devs to use the CPU instead of spending all the RAM on textures.
>>
>>
Maybe a good idea to have a couple CPUs though. As all the innovative stuff would be done in CPU. Instead of conforming to GPU bullshit.
So no downgrades. No conversion of work print to GPU nonsense.
Just sounds better. But would need a really good CPU. Cell wasn't. I would have liked to see Getaway as it was. GPU shit is not impressive. And fair enough to cancel the shit they were developing. It looked bad.
Some games still looked good, but a lot was done on the GPU. Like for Killzone 2. Which did post processing or something, motion blur, maybe lighting. The way they did it is a little documented. Shown off in presentations. But I don't remember much.
I guess ideally things are done in sequence. Rather than back and forth. But no downtime. If there is more time then a better job is done. Or like in Killzone 2 space is left for action so slowdown isn't so severe during awesome effects and characters on screen.
>>
>>738754851
>“For a while, [PS3 had] no GPU, it was going to run everything with SPUs. The ICE team proved to Japan that it was just impossible. It would be ridiculous. Performance-wise, it would be a disaster. That’s why they finally added the GPU, closer to the end.” - A developer from Naughty Dog
And the ICE Team was an elite R&D group within Naughty Dog. Essentially, they were Sony's "technical wizards" who optimized hardware to its absolute limits.
>>
>>
>>
>>738755252
*A lot was done on the CPU
I hate downgrades. Games should NEVER go through an optimisation process. Instead always use what the artists had already made. They would already have had hardware in mind. What hardware should and would be. Butchering it all for bad compromises destroys nearly all ability for it to be impressive. No wonder not many good artists are involved with games. It'd be soul destroying to see something you did perfectly be turned into a bad joke. Or worse. Like seeing your kid be crippled.
>>
>>
>>738755608
>M$ was smart and went with ATI to get something more cutting edge.
All three of them have made the mistake of trusting Nvidia. Nintendo are the only ones that made another console using them, though. MS dropped them after the OG Xbox, Sony after the PS3.
>>
>>
>>738755959
Not if you want an integrated system you control. Nvidia are notorious for being uncooperative in documenting how their chips work in B2B sales. Nvidia then demand a service payment to keep the software up to date. They're also adamant about not letting you alter their chips in any way. It's why Apple stopped using Nvidia chips in their Mac computers: Nvidia refused to explain how the chips worked, and refused to let Apple create a custom component for a chip to do 5K image output.
>>
>>
>>
>>738756387
So if the developers don't even have full control over their platform, are the consumers gonna have a "good experience"? Look at the Fisher-Price tablet that came out in 2017. It was barely on par with a 360 that came out in 2005. It's time to leave the cult; leather jacket man isn't your friend.
>>
>>
File: 1733016724122818.jpg (566.7 KB)
>>738754723
>Series X equivalent to 6800XT
>>
>>
>>
>>738756613
>It was barely on par with a 360 that came out in 2005
The Switch is a bit more powerful and more efficient at least. Also a lot more RAM. RDR proves this, since it runs at 1080p and always at 30 FPS.
>>
>>
>>
>>
>>
>>
>>
>>
>>
File: file.png (770.8 KB)
>>738762098
>>738762101
How didn't it?
>>
>>
>>738746067
https://youtu.be/m7M8lBKskx4?si=Duwy_FdnnTCQJuUc
>>
>>
>>
>>
>>
>>
>>
File: 1625329617632.png (443.3 KB)
>>738762684
>unlimited modding
>choice to use keyboard or controllers
>can upgrade my system based on what games i want to play
>no need to pay monthly fees for online
>can alt tab in a second and shitpost on a thread like this
I'm smoking some GOOD shit, anon.
>>
>>
>>
>>738746067
performance standards were the lowest they'd been since N64 because 7th gen consoles were massively outpaced by all other tech, but developers still wanted to use all the advances made in photorealistic graphics rendering, even if games ran like cold molasses because of it.
by 2011 most console games ran like shit, and if you wanted better you were expected to play on PC (even midrange PCs were destroying consoles at this point) or wait for a remaster on next gen consoles.
the TLOU "remaster" came out only a year after the original; most likely the game was in development for both PS3 and PS4 at the same time.
>>