Proper "Retro" resolution

Started by PixelOutlaw, March 11, 2025, 08:28:46

Previous topic - Next topic

PixelOutlaw

If you want that proper "retro" look, it's best to render the entire game into a small buffer first, then render THAT buffer scaled up to your native resolution.
Quite often I see people just scaling individual sprites, which ruins the effect because the game logic and transformations still take place at the native resolution.

The idea is quite simple:
1. The game and all of its logic run at the small resolution; you use one resolution for everything.
2. You blit that to an intermediate sprite (or buffer, depending on your language).
3. You then blit that image or buffer into the native window, scaled by the largest integer multiple that lets it fit.
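For example, with the numbers used in the code below: a 240x320 buffer in a 1024x768 window gives 1024/240 = 4 and 768/320 = 2 (integer division), so the scale is Min(4, 2) = 2. The buffer is then drawn at 480x640, centered with offsets of (1024 - 480)/2 = 272 and (768 - 640)/2 = 64.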

Here is some BlitzMax code. (Feel free to correct this code if you spot a bug; it's 2 AM for me here in Montana.)
The image is a bit blurred thanks to forum image-scaling shenanigans, but rest assured, it's razor-sharp and integer-scaled in real life.

[attached screenshot: buffet.png]

Please provide your own "ship.png"
SuperStrict

' Virtual resolution
Const VIRTUAL_WIDTH:Int = 240
Const VIRTUAL_HEIGHT:Int = 320

' Initialize Graphics FIRST
Graphics 1024, 768, 0 ' Example resolution; adjust as needed

' Native resolution
Global SCREEN_WIDTH:Int = GraphicsWidth()
Global SCREEN_HEIGHT:Int = GraphicsHeight()

' Scaling factors (maintain aspect ratio)
Global scaleX:Int = SCREEN_WIDTH / VIRTUAL_WIDTH
Global scaleY:Int = SCREEN_HEIGHT / VIRTUAL_HEIGHT
Global Scale:Int = Min(scaleX, scaleY)

' Centering offset
Global offsetX:Int = (SCREEN_WIDTH - (VIRTUAL_WIDTH * Scale)) / 2
Global offsetY:Int = (SCREEN_HEIGHT - (VIRTUAL_HEIGHT * Scale)) / 2

' Create an off-screen render target AFTER Graphics is initialized
Global framebuffer:TRenderImage = CreateRenderImage(VIRTUAL_WIDTH, VIRTUAL_HEIGHT, 0)

' Draws a 1px rectangle outline; the trailing 0 tells DrawLine not to draw the line's last pixel
Function DrawRectOutline:Int(x:Int, y:Int, width:Int, height:Int)
    Local rightEdge:Int = x + width - 1
    Local bottomEdge:Int = y + height - 1

    DrawLine x, y, rightEdge, y, 0                   ' Top
    DrawLine x, y, x, bottomEdge, 0                  ' Left
    DrawLine x, bottomEdge, rightEdge, bottomEdge, 0 ' Bottom
    DrawLine rightEdge, y, rightEdge, bottomEdge, 0  ' Right
End Function

Function main:Int()
    Local ship:TImage = LoadImage("ship.png", 0) ' Very important to set it to 0 so it doesn't blur pixels
   
    While Not KeyHit(KEY_ESCAPE)
       
        ' 1. Render to the virtual resolution framebuffer
        SetRenderImage framebuffer
        Cls
        DrawText "MR. CLOWN'S PARTY PANTS!!!", 16, 16
        DrawText "THIS IS A 240X320 BUFFER", 16, 32
        DrawRectOutline 1, 1, 240, 320 ' Strange BlitzMax line quirk: it won't draw the top/left edge at 0,0
        DrawImage ship, 64, 256
       
        ' Reset rendering to the screen (no Flip here; we Flip once per frame, after the scaled draw)
        SetRenderImage Null
        Cls
       
        ' 2. Draw the framebuffer scaled to native resolution
        DrawImageRect framebuffer, offsetX, offsetY, VIRTUAL_WIDTH * Scale, VIRTUAL_HEIGHT * Scale
        DrawText "This is a 1024x768 window - native resolution.", 32, 32
        Flip
    Wend
End Function

main()
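
If your BlitzMax build doesn't include TRenderImage/SetRenderImage, a rough fallback is to draw the scene into the corner of the backbuffer and grab it each frame. This is only a sketch under that assumption; GrabImage does a readback from the backbuffer every frame (and allocates a new TImage), so it's fine for prototyping, but the render-image route above is the better way where available.

SuperStrict

Const VIRTUAL_WIDTH:Int = 240
Const VIRTUAL_HEIGHT:Int = 320

Graphics 1024, 768, 0

' Same integer-scale math as above
Global Scale:Int = Min(GraphicsWidth() / VIRTUAL_WIDTH, GraphicsHeight() / VIRTUAL_HEIGHT)
Global offsetX:Int = (GraphicsWidth() - VIRTUAL_WIDTH * Scale) / 2
Global offsetY:Int = (GraphicsHeight() - VIRTUAL_HEIGHT * Scale) / 2

AutoImageFlags 0 ' Grabbed images get no filtering, so the pixels stay sharp

While Not KeyHit(KEY_ESCAPE)
    Cls

    ' 1. Draw the whole low-res scene into the top-left corner of the backbuffer
    DrawText "LOW-RES SCENE", 16, 16

    ' 2. Grab that corner into an image (a fresh TImage every frame - slow but simple)
    Local frame:TImage = GrabImage(0, 0, VIRTUAL_WIDTH, VIRTUAL_HEIGHT)

    ' 3. Clear and redraw the grab scaled up by the integer factor
    Cls
    DrawImageRect frame, offsetX, offsetY, VIRTUAL_WIDTH * Scale, VIRTUAL_HEIGHT * Scale
    Flip
Wend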


One DEFUN to rule them all, One DEFUN to find them, One DEFUN to RETURN them all, and in the darkness MULTIPLE-VALUE-BIND them.

Steve Elliott

Nice.  :D

I've been thinking of bringing an old game I wrote to more modern graphics resolutions (the game ran in 640 x 480).
Win11 64Gb 12th Gen Intel i9 12900K 5.2Ghz Nvidia RTX 3070Ti 8Gb
Win11 16Gb 12th Gen Intel i5 12450H 4.4Ghz Nvidia RTX 2050 8Gb
Win10/Linux Mint 16Gb 4th Gen Intel i5 4570 3.6GHz Nvidia GeForce GTX 1050 2Gb
Linux Mint 8Gb Celeron 2.6Ghz UHD Graphics600
macOS 64Gb M4 Max 16C GPU 40C
Spectrum Next 2Mb

angros47

Rendering to a small buffer and then upscaling it makes the image pixelated, like running an old game full-screen on a modern computer, but that just looks low-res, not "retro". Adding a post-processing shader to simulate the distortion of a CRT monitor, the color bleeding, and the interlaced lines gets much closer to a game played on retro hardware.

Before:

[attached screenshot: l16bck.png]

After:
[attached screenshot: Screenshot 2025-03-12 at 16-40-53 LEVEL16 FULLSCREEN.png]

Years ago I found a great shader by Dave Eggleston, for WebGL, that does exactly that. The site is gone, but I still have the shader (the license is Creative Commons Attribution, so it can be shared as long as the author's name is mentioned):
#ifdef GL_ES
    precision highp float;
#endif

uniform vec3 iResolution;
uniform float iGlobalTime;

uniform sampler2D iChannel0;

uniform float iShowScanlines;
uniform float iBlurSample;
uniform float iLight;

uniform bool iCurvature;
uniform bool iFullScreen;

uniform float iGamma;
uniform float iContrast;
uniform float iSaturation;
uniform float iBrightness;

// Post-effects colour-correct routine by Dave Hoskins on Shadertoy.
vec3 postEffects(in vec3 rgb, in vec2 xy) {
    rgb = pow(rgb, vec3(iGamma));
    rgb = mix(vec3(.5), mix(vec3(dot(vec3(.2125, .7154, .0721), rgb*iBrightness)), rgb*iBrightness, iSaturation), iContrast);

    return rgb;
}

// Sigma 1. Size 3
vec3 gaussian(in vec2 uv) {
    float b = iBlurSample / (iResolution.x / iResolution.y);

    uv += .5;

    vec3 col = texture2D(iChannel0, vec2(uv.x - b/iResolution.x, uv.y - b/iResolution.y) ).rgb * 0.077847;
    col += texture2D(iChannel0, vec2(uv.x - b/iResolution.x, uv.y) ).rgb * 0.123317;
    col += texture2D(iChannel0, vec2(uv.x - b/iResolution.x, uv.y + b/iResolution.y) ).rgb * 0.077847;

    col += texture2D(iChannel0, vec2(uv.x, uv.y - b/iResolution.y) ).rgb * 0.123317;
    col += texture2D(iChannel0, vec2(uv.x, uv.y) ).rgb * 0.195346;
    col += texture2D(iChannel0, vec2(uv.x, uv.y + b/iResolution.y) ).rgb * 0.123317;

    col += texture2D(iChannel0, vec2(uv.x + b/iResolution.x, uv.y - b/iResolution.y) ).rgb * 0.077847;
    col += texture2D(iChannel0, vec2(uv.x + b/iResolution.x, uv.y) ).rgb * 0.123317;
    col += texture2D(iChannel0, vec2(uv.x + b/iResolution.x, uv.y + b/iResolution.y) ).rgb * 0.077847;

    return col;
}

void main() {
    vec2 st = (gl_FragCoord.xy / iResolution.xy) - vec2(.5);

    // Curvature/light
    float d = length(st*.5 * st*.5);
    vec2 uv = st*d + st*.935;

    if (! iCurvature) uv = st;

    // CRT color blur
    vec3 color = gaussian(uv);

    // Light
    float l = 1. - min(1., d*iLight);
    color *= l;

    // Scanlines
    float y = uv.y; // change this to st.y for non-curved scanlines.

    if (iFullScreen) {
        float s = 1. - smoothstep(360., 1440., iResolution.y) + 1.;
        float j = cos(y*iResolution.y*s)*.1; // values between .01 and .25 are OK.
        color = abs(iShowScanlines-1.)*color + iShowScanlines*(color - color*j);
    } else {
        color *= 1. - (mod(gl_FragCoord.y, 2.)*.25*iShowScanlines) + mod(gl_FragCoord.y+1., 2.)*.25*iShowScanlines;
    }
    color *= 1. + ( .02 + ceil(mod( (st.x+.5)*iResolution.x, 3.) ) * (.995-1.02) )*iShowScanlines;

    // Border mask
    if (iCurvature) {
        float m = max(0.0, 1. - 2.*max(abs(uv.x), abs(uv.y) ) );
        m = min(m*200., 1.);
        color *= m;
    }

    // Color correction
    color = postEffects(color, st);

    gl_FragColor = vec4(max(vec3(.0), min(vec3(1.), color)), 1.);
}
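
A usage note, reading straight from the shader above: with iGamma, iContrast, iSaturation, and iBrightness all at 1.0 the postEffects() pass is an identity; iShowScanlines at 0. zeroes out all three scanline terms; iCurvature = false skips both the curvature warp and the border mask; iLight at 0. disables the edge darkening; and iBlurSample at 0. collapses the nine Gaussian taps onto the same texel (the weights sum to 1), which turns the blur off. Starting from those neutral values and raising one parameter at a time makes the look easier to dial in, and it also makes the whole pass trivial to offer as an optional toggle.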

PixelOutlaw

I agree, though for pixel-art purists shaders should be optional.
I think the very first pass is what everyone gets wrong, and that's rendering to a small buffer.

I see a lot of indie games these days that mix multiple scales across sprites, text, and special effects. It looks pretty awful to somebody who actually grew up playing games from the '80s and '90s: kind of like a noisy visual collage with no harmony.

Another unfortunate aspect of a CRT shader is that it removes some of the color intensity, but that's more of a taste thing. Thanks for the shader code; it's a good foundation.
One DEFUN to rule them all, One DEFUN to find them, One DEFUN to RETURN them all, and in the darkness MULTIPLE-VALUE-BIND them.

Steve Elliott

Yes, shaders do give a more authentic look, although for my taste that shader makes things a bit too soft. You can always tweak it or add another pass, I guess.  8)
Win11 64Gb 12th Gen Intel i9 12900K 5.2Ghz Nvidia RTX 3070Ti 8Gb
Win11 16Gb 12th Gen Intel i5 12450H 4.4Ghz Nvidia RTX 2050 8Gb
Win10/Linux Mint 16Gb 4th Gen Intel i5 4570 3.6GHz Nvidia GeForce GTX 1050 2Gb
Linux Mint 8Gb Celeron 2.6Ghz UHD Graphics600
macOS 64Gb M4 Max 16C GPU 40C
Spectrum Next 2Mb