How to improve image quality (PPI) of takeSnapshotAsync

Hey,

I’m using takeSnapshotAsync to save a snapshot of a composite view (an image with a ‘sticker’ view on top of it, à la Snapchat).

The code is almost copy+paste from the docs:

takeSnapshotOfImageContainer = async (type: 'file' | 'data-uri') => {
  // PixelRatio comes from 'react-native'; takeSnapshotAsync is the Expo snapshot helper.
  const targetPixelCount = __DEV__ ? 640 : 1080; // We want full HD pictures in prod
  const pixelRatio = PixelRatio.get(); // e.g. 2 or 3, depending on the device
  // The docs divide by the pixel ratio because the width/height options are
  // treated as logical ("device-independent") pixels, not physical pixels.
  const pixels = targetPixelCount / pixelRatio;

  const result = await takeSnapshotAsync(this.imageContainer, {
    result: type,
    height: pixels,
    width: pixels,
    quality: 1,
    format: 'png',
  });
  return result;
};
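
For reference, the container being snapshotted is set up roughly like this - a simplified sketch, not my exact code; the image URI, sticker asset and sizes are placeholders. (collapsable={false} is the usual Android workaround so the wrapper view isn’t optimized away before it can be captured by ref.)

// Simplified sketch of the composite view (placeholder sources and sizes).
import React from 'react';
import { View, Image, StyleSheet } from 'react-native';

class ImageWithSticker extends React.Component {
  imageContainer: View | null = null;

  render() {
    // collapsable={false} keeps Android from flattening this wrapper view away,
    // which would otherwise break capturing it by ref.
    return (
      <View
        ref={view => (this.imageContainer = view)}
        collapsable={false}
        style={{ width: 360, height: 360 }}>
        {/* The user's photo, filling the container */}
        <Image
          source={{ uri: 'https://example.com/travel-photo.jpg' }}
          style={StyleSheet.absoluteFill}
          resizeMode="cover"
        />
        {/* The sticker/logo overlay */}
        <Image
          source={require('./assets/sticker.png')}
          style={{ position: 'absolute', top: 20, left: 20, width: 80, height: 80 }}
        />
      </View>
    );
  }
}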

The resolution I get in the result is correct. For example, a photo taken with my Nexus 6P has a resolution of 3648 × 2736 at 180 ppi (pixels per inch). When I run that photo through takeSnapshotAsync, I get an image of 1080x1080 - however, the ppi drops to 72, making the image a bit grainy. The result is meant for sharing on social media (SoMe), so that sucks.

My question is: is there any way to maintain the PPI? I can’t see any handles besides ‘quality’, so I find it strange that the quality is decreased. Tagging @ide , as you seemed knowledgeable about the issue when I asked how to correctly save to a desired resolution :smiley:

Hi @jhalborg - I’m slightly confused by your question, since you said that the resulting image did have the correct resolution, 1080x1080 px. The ppi is not an inherent property of an image, but rather can change depending on how & where the image is displayed. You should be able to control the ppi (to some extent) when displaying an image.

You may want to try multiplying targetPixelCount by pixelRatio, rather than dividing, to produce pixels. This would give you a resulting image that is actually larger than 1080x1080 px, but you could display it at 1080x1080 “logical”/“device-independent” pixels so it renders at a higher effective resolution.
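
In terms of the snippet from the first post, that suggestion would look roughly like this (just a sketch; only the pixels calculation changes):

// Sketch of the suggested change: multiply instead of divide, so the snapshot
// is rendered with more physical pixels than the on-screen view.
const targetPixelCount = 1080;
const pixelRatio = PixelRatio.get(); // e.g. 2 or 3, depending on the device
const pixels = targetPixelCount * pixelRatio; // was: targetPixelCount / pixelRatio

const result = await takeSnapshotAsync(this.imageContainer, {
  result: 'file',
  height: pixels,
  width: pixels,
  quality: 1,
  format: 'png',
});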

Thanks for getting back to me @esamelson :slight_smile:

My use case is as follows:

  1. User picks an image, let’s say a travel image
  2. The user can now place a logo as a ‘sticker’ on the image, let’s say Ryan Air
  3. I want to save the result by snapshotting the containing view for the user to share it on SoMe

My question rephrased would therefore be: How can I make sure that no picture quality is lost in the snapshotting process? I just targeted 1080p, as that is what Instagram allows and it should look just fine, but the results are quite grainy.

Here are some examples

EDIT: Better examples. Also, for a moment I thought maybe the phone’s screen resolution might have an impact, but that can’t be it, as the images above are generated on an iPhone 8 which has a pretty great screen.

Friendly ping, @esamelson :slight_smile:

One way of improving quality might be passing a higher pixel target to the method than 1080p, but that’d create a new problem AFAIK. If, for example, the user’s photo actually is 1080x1080 and I pass 2160x2160 to the takeSnapshotAsync method, wouldn’t that create a somewhat distorted image?
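
If that turns out to be a problem, one idea (just a sketch - getImageSize is a little promise wrapper I’d write around React Native’s Image.getSize, and the photo URI is assumed to be at hand) would be to cap the snapshot target at the source photo’s own pixel size so nothing gets upscaled:

import { Image } from 'react-native';

// Promise wrapper around Image.getSize so the source photo's pixel size can be awaited.
const getImageSize = (uri: string): Promise<{ width: number; height: number }> =>
  new Promise((resolve, reject) =>
    Image.getSize(uri, (width, height) => resolve({ width, height }), reject)
  );

// Never ask the snapshot for more pixels than the source photo actually has,
// so a 1080x1080 photo is not blown up to 2160x2160.
const chooseTargetPixelCount = async (photoUri: string, desired: number = 2160) => {
  const { width, height } = await getImageSize(photoUri);
  return Math.min(desired, width, height);
};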

Just tried it with a targetPixelCount of 2000, and the resulting image is still exactly 72 PPI (measured with the Mac Preview app). I tried on both an iPhone 8 and a Nexus 6P, with the same result on both.

So it seems to me that takeSnapshotAsync might be hardcoded to save snapshots at a fixed 72 PPI? Maybe this should actually be a GitHub issue, @esamelson ?

Interestingly, if I pass 0.5 as quality, I get the same pixel dimensions and PPI in the result - so maybe the quality option is broken?

@jhalborg - I think there is a misunderstanding here. Unless you are printing your screenshots, the file’s PPI metadata will have zero impact on how the image is displayed. See here for more information: DPI & PPI are irrelevant, only pixel dimensions matter online | FGWeb

The results you’re getting from quality are expected – quality doesn’t change the pixel dimensions of an image, only how heavily its data is compressed (and since you’re saving as png, which is lossless, the quality option likely has no effect here at all).

If you want your image to look crisper, you need to either display the 1080x1080 screenshot at a smaller physical dimension (e.g. 540x540 “logical”/“device-independent” pixels) or take a screenshot with a larger actual pixel size (e.g. 2160x2160).
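
For example (fragments only - resultUri stands for whatever takeSnapshotAsync returned):

// Option 1: show the 1080x1080 snapshot at 540x540 logical points, so on a
// 2x screen it maps roughly 1:1 to physical pixels and looks crisper.
<Image source={{ uri: resultUri }} style={{ width: 540, height: 540 }} />

// Option 2: capture more physical pixels in the first place.
const bigger = await takeSnapshotAsync(this.imageContainer, {
  result: 'file',
  width: 2160,
  height: 2160,
  quality: 1,
  format: 'png',
});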

@esamelson - I see, that makes sense. PPI is the wrong metric.

However, there’s a clear downgrade in picture quality, no matter how many pixels I use.

Here is another example with a picture of my MacBook’s keyboard and speaker grill, complete with a lot of finger grease and a small hair :smiley: It was taken with an iPhone 8, and the original is 4032x3024 pixels and very sharp.

I then tried two settings for takeSnapshotAsync, roughly as sketched below:

Setting 1: Quality 1, 2000x2000 pixels
Setting 2: Quality 1, 3024x3024 pixels (same height as original)
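
Roughly what the two attempts look like in code (a sketch - I’m assuming the sizes go straight into the width/height options from the snippet in the first post):

// Setting 1
const result1 = await takeSnapshotAsync(this.imageContainer, {
  result: 'file',
  width: 2000,
  height: 2000,
  quality: 1,
  format: 'png',
});

// Setting 2 (3024 matches the original photo's height)
const result2 = await takeSnapshotAsync(this.imageContainer, {
  result: 'file',
  width: 3024,
  height: 3024,
  quality: 1,
  format: 'png',
});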

Both results are noticeably more blurry than the original, in spite of setting 2 being close to the original in pixels. Is there anything that can be done to preserve the quality of the original in the snapshot?

@jhalborg - if you compare the result of takeSnapshotAsync with a manual screenshot of the same screen on your device (e.g. on an iPhone, hold down power + press home), is it significantly different?

I think so - I added the screenshot as the last image in the album, and it looks a lot clearer to me.

Where do I go from here to get less blurry results from takeSnapshotAsync @esamelson ? :slight_smile:


@esamelson - Friendly ping :slight_smile: Should I create a Github issue instead?

Hi @jhalborg - sorry for the silence here. Yes, if you could create a github issue with the relevant information from this thread and an easy repro case, that would be great :slight_smile:
