Incorrect glyph scaling in my terminal emulator


I'm writing a terminal emulator in C++20, just for fun. My goal is an exact emulation of DOS VGA text mode (80x25 characters), but I want to let the user choose the window resolution. I'm using a 1280x720 window for my tests, and the font is ModernDOS8x16.ttf (monospace). For rendering, I'm using the SDL2 library with my own OpenGL 3.3 code.

Here are the important parts of my code:

class Grid
{
    public:
        // ... omitted
        
    private:
        // Total environment dimension in pixels
        // (in this case, the size of the window)
        int m_envWidth, m_envHeight;

        // Number of cells
        int m_numCellsX, m_numCellsY;

        // Width and height of each cell
        // (in pixels)
        float m_cellWidth, m_cellHeight;

        // Rectangle (position and size) of each cell
        std::vector<Rect<float>> m_cellRects;
};

void Grid::initFromNumberOfCells(int numCellsX, int numCellsY)
{
    m_numCellsX = numCellsX;
    m_numCellsY = numCellsY;
    // Calculate the size of each cell with floating point precision
    m_cellWidth  = static_cast<float>(m_envWidth)  / static_cast<float>(m_numCellsX);
    m_cellHeight = static_cast<float>(m_envHeight) / static_cast<float>(m_numCellsY);

    this->makeCellRectangles();
}

void Grid::makeCellRectangles()
{
    for (int y = 0; y < m_numCellsY; y++)
    {
        const float posY = static_cast<float>(y) * m_cellHeight;

        for (int x = 0; x < m_numCellsX; x++)
        {
            const float posX = static_cast<float>(x) * m_cellWidth;

            m_cellRects.push_back({ posX, posY, m_cellWidth, m_cellHeight });
        }
    }
}

This gives me a cell size of 16x28.8 px.
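For reference, that's just the division above (a quick sanity check, assuming the grid was constructed with the 1280x720 window size):

Grid grid;                           // m_envWidth = 1280, m_envHeight = 720 (constructor omitted)
grid.initFromNumberOfCells(80, 25);  // VGA text mode dimensions
// m_cellWidth  = 1280.0f / 80.0f = 16.0f
// m_cellHeight =  720.0f / 25.0f = 28.8f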

// A single character in the framebuffer
struct alignas(4) Character
{
    // Codepoint in UTF-32
    std::uint32_t codepoint{ 32U };
    // Foreground/background color
    std::uint16_t fg{ 15U }, bg{};
};
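With that layout the struct should pack into 8 bytes on typical platforms (just a quick check, nothing in the rendering depends on it):

static_assert(sizeof(Character) == 8, "Character should pack into 8 bytes");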

// Main framebuffer class
class Framebuffer
{
    public:
        // ... omitted
    
    private:

        // Emulated terminal size in
        // number of columns and rows
        // Default: 80x25
        // https://en.wikipedia.org/wiki/VGA_text_mode
        Size<int> m_terminalSize;

        // Array of character information
        std::vector<Character> m_framebuffer;
};
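The actual setup is omitted; conceptually the framebuffer is just a flat, row-major array with one Character per cell, roughly like this (initTerminal is a hypothetical name, the real code differs):

void Framebuffer::initTerminal(int columns, int rows)
{
    m_terminalSize = { columns, rows };

    // One default Character (space, white on black) per cell
    m_framebuffer.assign(static_cast<std::size_t>(columns * rows), Character{});
}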

// Write a single character at the specified position
void Framebuffer::putCharacterAt(std::uint32_t codepoint, std::uint16_t fg, std::uint16_t bg, int x, int y)
{
    // Get the index that corresponds to the cursor position
    auto index = static_cast<std::size_t>((m_terminalSize.width * y) + x);

    // Put the character there
    m_framebuffer[index] = Character{ .codepoint = codepoint, .fg = fg, .bg = bg };
}
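So for the default 80x25 terminal, writing at column 3, row 2 lands at index 80 * 2 + 3 = 163. Example call with made-up values:

framebuffer.putCharacterAt(U'A', 15U, 0U, 3, 2); // white 'A' on black at column 3, row 2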

This works fine. The problem is in the character rendering. I want each character to be rendered inside its grid cell, centered, and at a proper size. My bitmap font is baked at a larger size, so I must scale it down, but I can't figure out exactly how I should do this. Currently I compute the ratio between the glyph size and the cell size and use the maximum component as the scale. This works fine for some characters like 'a', but not for others like 'i' or ';' (screenshot at the bottom of the post).

void Framebuffer::renderSingleCharAtCell(const BitmapFont& font, Character character,
                                         const Texture2D& textureSprites,
                                         const Rect<float>& cellSize, const ColorPalette<16>& palette,
                                         PrimitiveProcessor<Quad>& processor,
                                         VertexBatch& batchText, VertexBatch& batchSprites) const
{

    // If the glyph is not found, replace it with a question mark
    if (!hasGlyph(font, character.codepoint))
        character.codepoint = 63U;

    // Get glyph data
    const auto& glyph = getGlyphData(font, character.codepoint);

    // Easy-to-remember shortcut
    const auto& glyphSize = glyph.uvCoords;

    // Calculate a ratio between
    // the glyph and the cell sizes
    const Vec2 glyphRatio
    {
        glyphSize.width  / cellSize.width,
        glyphSize.height / cellSize.height
    };

    // Pick the maximum of the two ratios
    float picked = std::max(glyphRatio.x, glyphRatio.y);

    // Draw rectangle for this glyph
    // These are the sprite coordinates where it should be rendered
    Rect<float> glyphRect
    {
        cellSize.x,
        cellSize.y + cellSize.height - (static_cast<float>(glyph.bearingY) / picked),
        glyphSize.width  / picked,
        glyphSize.height / picked
    };

    // Get font texture
    auto& fontTexture = *font.getTexturePtr();

    // Get color indices
    auto fgColorIndex = static_cast<std::size_t>(character.fg);
    auto bgColorIndex = static_cast<std::size_t>(character.bg);

    // Render the background
    processor.addPrimitive(batchSprites, { cellSize, opengl::calcNormalizedUV(textureSprites, WHITE_BOX_RECT), palette.getColorAt(bgColorIndex) });

    // Manually render the glyph inside the cell
    processor.addPrimitive(batchText, { glyphRect, opengl::calcNormalizedUV(fontTexture, glyph.uvCoords), palette.getColorAt(fgColorIndex) });

}
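To illustrate the problem with made-up numbers (the real sizes come from my baked font, these are just hypothetical): say 'a' is baked at 28x30 px and 'i' at 8x30 px, and the cell is the 16x28.8 px from above:

// 'a': glyphRatio = { 28 / 16, 30 / 28.8 } = { 1.75, 1.04 } -> picked = 1.75
//      rendered size = 28 / 1.75 x 30 / 1.75 ≈ 16.0 x 17.1 px
// 'i': glyphRatio = {  8 / 16, 30 / 28.8 } = { 0.50, 1.04 } -> picked = 1.04
//      rendered size =  8 / 1.04 x 30 / 1.04 ≈  7.7 x 28.8 px

Each glyph ends up with its own scale factor, which matches what I see in the screenshot.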

My question is: How should I calculate the correct proportion for each glyph? Here are my results so far:

[screenshot: terminal output showing the incorrectly scaled glyphs]
