SharpGL Low Resolution Textures

I am loading textures in using the following code:

var texture = new SharpGL.SceneGraph.Assets.Texture();
texture.Create(gl, filename);

But when I render them onto a polygon they are extremely low resolution. It looks like about 100x100, but the source image is much higher resolution than that.

To apply the texture, I later call:

gl.Enable(OpenGL.GL_TEXTURE_2D);
gl.BindTexture(OpenGL.GL_TEXTURE_2D, 0);

Those are all the texture commands I call, other than supplying each vertex with a gl.TexCoord. This all works, but the displayed image is very pixelated and blurry.
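
For reference, by "supplying each vertex with a gl.TexCoord" I mean an immediate-mode draw roughly like this (a minimal sketch; the quad coordinates are made up):

gl.Begin(OpenGL.GL_QUADS);
gl.TexCoord(0.0f, 0.0f); gl.Vertex(-1.0f, -1.0f, 0.0f);
gl.TexCoord(1.0f, 0.0f); gl.Vertex(1.0f, -1.0f, 0.0f);
gl.TexCoord(1.0f, 1.0f); gl.Vertex(1.0f, 1.0f, 0.0f);
gl.TexCoord(0.0f, 1.0f); gl.Vertex(-1.0f, 1.0f, 0.0f);
gl.End();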

Is there some OpenGL setting that I must use to enable higher resolution textures?



Solution 1:[1]

So the answer was that the Create method in SharpGL.SceneGraph that creates textures downsamples them to the next lowest power of two for width and height, and it does a poor job of it. For example, an image that's 428x612 gets downsampled to 256x512, and not cleanly.
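
To make the rounding concrete, here is the kind of power-of-two calculation the loader effectively applies to both dimensions (my own illustration, not SharpGL's actual code):

// Illustrative only: round a dimension down to the nearest power of two.
static int PreviousPowerOfTwo(int value)
{
    int result = 1;
    while (result * 2 <= value)
        result *= 2;
    return result;
}

// PreviousPowerOfTwo(428) == 256 and PreviousPowerOfTwo(612) == 512,
// which is how a 428x612 image ends up as a 256x512 texture.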

I wrote this extension method (shown below in a static class with the required usings), which imports a bitmap into a texture and retains the full resolution.

using System.Drawing;
using System.Drawing.Imaging;
using SharpGL;

public static class OpenGLExtensions
{
    public static bool CreateTexture(this OpenGL gl, Bitmap image, out uint id)
    {
        if (image == null)
        {
            id = 0;
            return false;
        }

        // Let SharpGL generate a texture name, but skip its Create(gl, filename)
        // overload, which is what downsamples to powers of two.
        var texture = new SharpGL.SceneGraph.Assets.Texture();
        texture.Create(gl);
        id = texture.TextureName;

        // Lock the bitmap and upload its pixels at full resolution.
        BitmapData bitmapData = image.LockBits(new Rectangle(0, 0, image.Width, image.Height), ImageLockMode.ReadOnly, PixelFormat.Format32bppArgb);
        var width = image.Width;
        var height = image.Height;
        gl.BindTexture(OpenGL.GL_TEXTURE_2D, texture.TextureName);
        // 32993 = GL_BGRA (matches GDI+'s 32bppArgb byte order), 5121 = GL_UNSIGNED_BYTE.
        gl.TexImage2D(OpenGL.GL_TEXTURE_2D, 0, OpenGL.GL_RGB, width, height, 0, 32993u, 5121u, bitmapData.Scan0);
        image.UnlockBits(bitmapData);
        image.Dispose(); // the caller's bitmap is consumed here

        // Clamp at the edges and use linear filtering (no mipmaps are generated).
        gl.TexParameterI(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_WRAP_S, new[] { OpenGL.GL_CLAMP_TO_EDGE });
        gl.TexParameterI(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_WRAP_T, new[] { OpenGL.GL_CLAMP_TO_EDGE });
        gl.TexParameterI(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_MIN_FILTER, new[] { OpenGL.GL_LINEAR });
        gl.TexParameterI(OpenGL.GL_TEXTURE_2D, OpenGL.GL_TEXTURE_MAG_FILTER, new[] { OpenGL.GL_LINEAR });
        return true;
    }
}

I suppose I wouldn't have encountered this problem if I had supplied image files that were already scaled to powers of two, but that wasn't obvious.
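
If you would rather keep the stock loading path and just feed it power-of-two images, a pre-scaling step along these lines (my own sketch using plain System.Drawing, not part of the original answer) leaves the loader nothing to resample:

// Sketch: scale a bitmap up to the next power of two with high-quality interpolation,
// so the stock loader has nothing left to downsample.
static Bitmap ScaleToPowerOfTwo(Bitmap source)
{
    int NextPowerOfTwo(int v) { int p = 1; while (p < v) p *= 2; return p; }

    var resized = new Bitmap(NextPowerOfTwo(source.Width), NextPowerOfTwo(source.Height));
    using (var g = Graphics.FromImage(resized))
    {
        g.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
        g.DrawImage(source, 0, 0, resized.Width, resized.Height);
    }
    return resized;
}

The resized copy can then be saved back out and loaded through the usual texture.Create(gl, filename) call.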

Usage

if (gl.CreateTexture(bitmap, out var id))
{
    // Do something on success
}
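
At draw time the returned id is bound like any other texture name, and it can be deleted once it is no longer needed (hypothetical usage, not part of the original answer):

// Bind the texture before drawing the textured polygon...
gl.Enable(OpenGL.GL_TEXTURE_2D);
gl.BindTexture(OpenGL.GL_TEXTURE_2D, id);
// ...draw with gl.TexCoord / gl.Vertex as usual...

// ...and release it during cleanup.
gl.DeleteTextures(1, new[] { id });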

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Mark Mercer