How to make certain pixels of a UIImage transparent on the iPhone

Asked 2 years ago, Updated 2 years ago, 97 views

Hello.

I'd like to make certain pixels transparent on iPhone.

On Android, a pixel became transparent if its alpha value was zero.

On the iPhone, R, G, B, and A must all be zero to make a pixel transparent.

I don't understand why this is, since I lack the basic knowledge.

Does anyone know how to make a pixel transparent with just a zero alpha value? Below is the code I'm writing as a sample.

- (UIImage *)processWithePixels:(UIImage *)teeImage alpha:(int)trans {

    UInt32 * teePixels;

    CGImageRef teeCGImage = [teeImage CGImage];
    NSUInteger teeWidth = CGImageGetWidth(teeCGImage);
    NSUInteger teeHeight = CGImageGetHeight(teeCGImage);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    NSUInteger bytesPerPixel = 4;
    NSUInteger bitsPerComponent = 8;

    NSUInteger teeBytesPerRow = bytesPerPixel * teeWidth;

    teePixels = (UInt32 *)calloc(teeHeight * teeWidth, sizeof(UInt32));


    CGContextRef context = CGBitmapContextCreate(teePixels, teeWidth, teeHeight,
                                                 bitsPerComponent, teeBytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast|kCGBitmapByteOrder32Big);



    CGContextDrawImage(context, CGRectMake(0, 0, teeWidth, teeHeight), teeCGImage);

    // Walk every pixel and recompute its color and alpha.
    for (NSUInteger j = 0; j < teeHeight; j++) {
        for (NSUInteger i = 0; i < teeWidth; i++) {
            UInt32 *currentPixel = teePixels + (j * teeWidth) + i;


            UInt32 color = *currentPixel;
            // mColor ends up as the smallest of the R, G, B components,
            // inverted and doubled, then capped at the trans parameter.
            int mColor = 0;

            if(color != 0){
                if(B(color) < R(color)){
                    mColor = B(color);
                }else{
                    mColor = R(color);
                }

                if(mColor > G(color)){
                    mColor = G(color);
                }

                mColor  = 255 - mColor;
                mColor = (int)(mColor * 2);
                if(mColor > trans ){
                    mColor = trans;
                }
            }else{
                mColor = 0;
            }


            CGFloat alpha = 255 - mColor;
            // Use signed intermediates so the 0..255 clamp below can work;
            // an unsigned value can never drop below zero.
            int newR = (int)(R(color) - alpha);
            int newG = (int)(G(color) - alpha);
            int newB = (int)(B(color) - alpha);

            newR = MAX(0, MIN(255, newR));
            newG = MAX(0, MIN(255, newG));
            newB = MAX(0, MIN(255, newB));

            *currentPixel = RGBAMake(newR, newG, newB, mColor);

        }
    }

    // 4. Create a new UIImage
    CGImageRef newCGImage = CGBitmapContextCreateImage(context);
    UIImage * processedImage = [UIImage imageWithCGImage:newCGImage scale:1 orientation:teeImage.imageOrientation];

    // 5. Cleanup!
    CGColorSpaceRelease(colorSpace);

    CGContextRelease(context);

    CFRelease(newCGImage);
    free(teePixels);

    return processedImage;
}
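
The post doesn't show the R(), G(), B(), A(), and RGBAMake() macros the code relies on. Assuming the usual definitions for a buffer created with kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big and read as host-order UInt32 values (an assumption, since the question doesn't include them), they would look something like this:

// Assumed helper macros (not shown in the original post): pull one 8-bit
// component out of a UInt32 pixel, and pack four components back together.
#define Mask8(x) ((x) & 0xFF)
#define R(x) (Mask8(x))              // red in the lowest byte
#define G(x) (Mask8((x) >> 8))
#define B(x) (Mask8((x) >> 16))
#define A(x) (Mask8((x) >> 24))      // alpha in the highest byte
#define RGBAMake(r, g, b, a) \
    (Mask8(r) | (Mask8(g) << 8) | (Mask8(b) << 16) | (Mask8(a) << 24))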

objective-c ios pixel

2022-09-22 21:53

1 Answer

kCGImageAlphaPremultipliedLast

I think it has something to do with this. That flag means the alpha channel sits at the very end of each pixel, i.e., the bitmap is read and used as an RGBA-type image.

I've heard that ARGB is what's used in current versions. If zeroing R alone is what makes a pixel transparent, the problem would be caused by the difference between ARGB and RGBA ordering.
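
To make that point concrete, here is a small illustrative sketch (the two packings below are hypothetical examples, not a statement of what the image actually uses) showing how the byte the code treats as alpha changes with the packing:

#import <Foundation/Foundation.h>

// Illustrative only: the same component values packed two different ways.
// If the extraction macros assume one packing but the bitmap delivers the
// other, "reading A" really reads a color channel instead of alpha.
static void DemoPixelPacking(void) {
    UInt32 r = 0x10, g = 0x20, b = 0x30, a = 0xFF;

    // Packing the posted macros expect: R in the low byte, A in the high byte.
    UInt32 alphaHigh = r | (g << 8) | (b << 16) | (a << 24);  // 0xFF302010

    // A different (hypothetical) packing with alpha in the low byte.
    UInt32 alphaLow  = a | (b << 8) | (g << 16) | (r << 24);  // 0x102030FF

    // With an A(x) macro that takes the top byte, alphaHigh yields 0xFF as
    // intended, but alphaLow yields 0x10 (the red value), so zeroing
    // "alpha" would really change a color channel.
    NSLog(@"0x%08X vs 0x%08X", (unsigned int)alphaHigh, (unsigned int)alphaLow);
}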

kCGBitmapByteOrder32Big also has an effect: where each component ends up depends on whether the bitmap is little-endian or big-endian.

First, you should read CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(teeCGImage) to see which bitmap type the image actually is, and then create the context with matching settings. Please refer to the CGImageGetBitmapInfo documentation.
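
A minimal sketch of that check (it could be dropped near the top of the processWithePixels: method from the question, where teeCGImage is already available) might look like this:

// Inspect how the source CGImage actually stores its pixels before
// deciding which settings to pass to CGBitmapContextCreate.
CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(teeCGImage);
CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(teeCGImage);

// Which end of the 32-bit word the bytes start from.
BOOL littleEndian = (bitmapInfo & kCGBitmapByteOrderMask) == kCGBitmapByteOrder32Little;

// Whether alpha comes before or after the color components.
BOOL alphaFirst = (alphaInfo == kCGImageAlphaPremultipliedFirst ||
                   alphaInfo == kCGImageAlphaFirst ||
                   alphaInfo == kCGImageAlphaNoneSkipFirst);

NSLog(@"bitmapInfo=0x%08X alphaInfo=%u littleEndian=%d alphaFirst=%d",
      (unsigned int)bitmapInfo, (unsigned int)alphaInfo, littleEndian, alphaFirst);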


2022-09-22 21:53


