Originally I was using the code outlined here, which was enough to prove that I could do this, and it displayed 24-bit surfaces perfectly fine. But when I went to use it to display 32-bit images, I got really weird results. After searching around the internet for a while and trying a lot of different things, I determined that it wasn't a problem with SDL or with wxWidgets per se, but rather with the conversion between the two. Namely: SDL stores the pixel data for 32-bit images as interleaved "RED|GREEN|BLUE|ALPHA" bytes for each pixel, while for images with an alpha component (32-bit depth), wxWidgets expects two separate arrays: one for the RGB data, and a second for the alphas. This took a decent amount of time for me to figure out, and while it isn't a huge thing, I figured I couldn't be the only person to run into it, so I wanted to put this out there to help you guys.
Code: Select all
wxBitmap toBitmap(SDL_Surface* surface)
{
    // If our depth is 24 bits or less, we don't need to do anything special --
    // there's no alpha channel for us to take care of. Pass static_data = true
    // so wxImage doesn't try to free the surface's own pixel buffer.
    if (surface->format->BitsPerPixel <= 24)
        return wxBitmap(wxImage(surface->w, surface->h,
                                static_cast<unsigned char*>(surface->pixels), true));

    // If we're working with something that is 32-bit depth, then we need to filter
    // the RGB and alpha values out into two separate arrays. This is going to be a
    // somewhat heavy process. The buffers are allocated with malloc() and handed to
    // wxImage with static_data left false, so wxImage takes ownership and frees them.
    unsigned char* srcPixels = static_cast<unsigned char*>(surface->pixels);
    unsigned char* rgbPixels = static_cast<unsigned char*>(malloc(surface->w * surface->h * 3));
    unsigned char* alphas    = static_cast<unsigned char*>(malloc(surface->w * surface->h));

    size_t srcIndex = 0, rgbIndex = 0, alphaIndex = 0;
    for (int y = 0; y < surface->h; ++y)
    {
        for (int x = 0; x < surface->w; ++x)
        {
            rgbPixels[rgbIndex++] = srcPixels[srcIndex++]; // R
            rgbPixels[rgbIndex++] = srcPixels[srcIndex++]; // G
            rgbPixels[rgbIndex++] = srcPixels[srcIndex++]; // B
            alphas[alphaIndex++]  = srcPixels[srcIndex++]; // A
        }
    }

    return wxBitmap(wxImage(surface->w, surface->h, rgbPixels, alphas, false));
}
NOTE: All of the above was just a test to see if this would even work (it did -- 32-bit SDL_Surfaces now display properly, alphas included), so I didn't write it with optimisation in mind. For this side project I'm not worrying much about performance, but if you want to use this in a real program, you might want to see if you can optimise it. Also watch the ownership of the pixel buffers: if you allocate them yourself and pass static_data = true to the wxImage constructor, wxImage will never free them and you'll leak memory on every call. Allocating them with malloc() and leaving static_data false lets wxImage take ownership and free them for you.
So I'm posting this in the hope that it helps others out. And if anyone finds a more optimal way to do this, please let me know!