# What would the M87* black hole image look like if we saw 230 GHz radio waves?

I originally posted this as a question and an answer on physics.stackexchange.com, but it was not received well there and was closed.

The Event Horizon Telescope (EHT) M87* black hole image shows equivalent brightness temperature, $T_\mathrm{b}$, in kelvins, mapped to color using a perceptually uniform colormap from ehtplot. By the Rayleigh–Jeans law, equivalent brightness temperature is proportional to specific intensity:

$$T_b = \frac{c^2}{2\nu^2k}I_\nu,\tag{1}$$

where $I_\nu$ is the specific intensity, $c$ is the speed of light, $\nu$ is the observing frequency (230 GHz), and $k$ is the Boltzmann constant (source: paper IV).
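As a sanity check on the scale, Eq. 1 is easy to evaluate numerically; the example intensity below is purely illustrative, not a measured value:

```python
# Evaluate Eq. 1 numerically. SI units throughout.
c = 2.99792458e8    # speed of light, m/s
k = 1.380649e-23    # Boltzmann constant, J/K
nu = 230e9          # observing frequency, Hz

def brightness_temperature(I_nu):
    """Rayleigh-Jeans equivalent brightness temperature (K) from
    specific intensity I_nu in W m^-2 Hz^-1 sr^-1 (Eq. 1)."""
    return c**2 / (2 * nu**2 * k) * I_nu

# An illustrative intensity of 1e-7 W m^-2 Hz^-1 sr^-1 gives about
# 6.2e9 K, i.e. the 10^9-10^10 K order of the published image.
```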

Based on the appearance of the black hole image, the lightness of the color, per the CIECAM02 color appearance model, is assumed to be perceptually uniform. Lightness alone does not fully define a color, which allows creating arbitrarily customized colormaps such as the one used for the black hole image.

In perceptual photometry, radiant intensity $I_\mathrm{e}$ in watts per steradian (W/sr) is converted to luminous intensity $I_\mathrm{v}$ in candelas (cd) by:

$$I_\mathrm{v} = 683\,\overline{y}(\lambda)\,I_\mathrm{e},\tag{2}$$

where ${\overline {y}}(\lambda )$ is the standard luminosity function, which is zero-valued outside the visible spectrum (source: Wikipedia article on photometry).
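Eq. 2 in code form; the input values below are only for illustration:

```python
def luminous_intensity(I_e, y_bar):
    """Luminous intensity in candelas (Eq. 2) from radiant intensity
    I_e in W/sr, weighted by the luminosity function value y_bar."""
    return 683.0 * y_bar * I_e

# At 555 nm the photopic luminosity function peaks at 1.0, so a
# radiant intensity of 0.01 W/sr corresponds to 6.83 cd.
```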

Perhaps a more natural mapping of the monochromatic 230 GHz black hole image to a visible color image would be obtained by equating $I_\nu$ to $I_\mathrm{e}$, replacing ${\overline {y}}(\lambda )$ of Eq. 2 with a non-zero value, and creating an image that reproduces the luminous intensity $I_\mathrm{v}$, as if we were able to see 230 GHz radio waves. By "more natural" I mean that we have intuitive facilities for processing visual intensity as is, whereas in the published image, intensity was mapped to a perceptually uniform scale. Chromaticity could be left out by presenting a grayscale image. Question: What would the black hole image look like with such a colormap?

After the brightness temperature data is obtained, it could be converted from the linear intensity scale using the sRGB transfer function (from the Wikipedia article on sRGB):

$$\gamma(u) = \begin{cases}12.92 u & u \lt 0.0031308 \\1.055 u^{1/2.4} - 0.055 & \text{otherwise,}\end{cases}\tag{3}$$

where $u$ is linear intensity, and saved in an image file for viewing.
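A vectorized sketch of Eq. 3, assuming NumPy; `T` stands for the brightness temperature array that is yet to be obtained:

```python
import numpy as np

def srgb_gamma(u):
    """sRGB transfer function (Eq. 3): linear intensity u in [0, 1]
    to a nonlinear sRGB value in [0, 1]."""
    u = np.asarray(u, dtype=float)
    return np.where(u < 0.0031308,
                    12.92 * u,
                    1.055 * u ** (1 / 2.4) - 0.055)

# Once a brightness temperature array T is available, normalize it
# to [0, 1] and encode it, e.g.: encoded = srgb_gamma(T / T.max())
```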

One approach to obtain a brightness temperature image would be to invert (as in finding the inverse function) the colormap of the published image and to work from that. I downloaded the 16-bit/channel TIFF "Fullsize Original" eso1907a.tif from ESO on 2019-04-12 (they have since replaced the file with an 8-bit/channel TIFF, obtained 2019-04-14).

A couple of things make inverting the colormap difficult. Firstly, the image does not come with reliable information about the colormap, such as a colorbar. Secondly, the 16-bit/channel TIFF has 18992 unique colors (the 8-bit/channel TIFF has 67395), whereas I can only get 256 unique colors from pyplot.get_cmap('afmhot_u') in Python. This makes me believe that the image is not directly colormapped brightness temperature but has been altered afterwards, perhaps smoothed and/or contrast stretched. It is difficult to guess what processing might have taken place and in which color space, although the scientific nature of the image limits what would be responsibly done. An EXIF field shows that the image was saved with Adobe Photoshop CC 2019 (Windows), so the direct source is not a (Python) script. It is possible that only cropping or tagging was done in Photoshop.

However, if we highlight the 10 % green channel contour and the 50 % red channel contour of the 16-bit eso1907a.tif, most of the central shadow shows an inversion in the order of the contours, which means that the image is not directly from a colormap:

Figure 1. An analysis of 16-bit eso1907a.tif in GIMP by highlighting the contours in which the red color channel has 50 % value and the green color channel has 10 % value. Original image credit: Event Horizon Telescope.
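The unique-color counts mentioned above can be reproduced with a short check; reading the TIFF with Pillow is an assumption (16-bit TIFF support varies by Pillow version):

```python
import numpy as np

def count_unique_colors(rgb):
    """Count unique color triplets in an (H, W, 3) image array."""
    return len(np.unique(rgb.reshape(-1, 3), axis=0))

# Usage sketch (assumes Pillow can decode this TIFF variant):
# from PIL import Image
# rgb = np.asarray(Image.open('eso1907a.tif'))
# print(count_unique_colors(rgb))
```

A 256-entry colormap can produce at most 256 unique colors, so a count in the tens of thousands already implies post-colormap processing.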

Another approach would be to reconstruct the brightness temperature data from earlier data in the imaging pipelines. Katie Bouman's recent Caltech talk (published 2019-04-12) explains that the final image was a combination of images generated by three pipelines: DIFMAP, eht-imaging, and SMILI, each contributing an image from 2017-04-05, -06, -10, and -11. Each image was blurred until the images were similar enough as measured by normalized cross-correlation, and then all of the images were averaged.

Based on this, there is yet another way to construct the final image, by averaging brightness temperatures of Fig. 15 of Paper IV:

Figure 2. Fig. 15 of Paper IV. The subfigures are averages of blurred images from the three imaging pipelines. Image credit: Event Horizon Telescope.

Conveniently, the figure has a labeled colorbar. The image format is JPEG, so there will be at least some compression artifacts in addition to the white overlaying graphics. We can load the image and separate the color bar and the four images, and compare the color bar to afmhot_u, using Python script 1 below. The result:

Figure 3. Red-green-blue (RGB) color channel values from the color bar of Fig. 2 (solid) and from the afmhot_u colormap (dashed), as a function of the normalized input value proportional to brightness temperature.

The knee points of the color channel curves (Fig. 3) of the color bar in Fig. 2 are approximately at the same input values as those of afmhot_u, which is further evidence that this colormap was used, but some knee points have been shifted horizontally. This could be some kind of artifact from color space conversions, with truncation of the red channel leaking into the other channels, but a proper interpretation of the shift is difficult. Because of these differences, an inversion of the afmhot_u colormap cannot be used.

Or one could ask the EHT imaging team for the brightness temperature data.

We can scrape the colormap from the colorbar in Fig. 2, convert each of its subfigures by picking, for each pixel, the brightness temperature that gives the least squared error over the RGB triplet, average the brightness temperature subfigures, and save the result using the afmhot_u colormap (Fig. 4) and as intensity in sRGB using Eq. 3 (Fig. 5). This is done by Python script 2 below.

Figure 4. Brightness temperature (in units of $10^9$ K) of the reconstructed final black hole image colormapped using afmhot_u. The horizontal and vertical axes are in pixels of the same size as in Fig. 2. Processed from image credit: Event Horizon Telescope.

Figure 5. Brightness temperature of the reconstructed final black hole image saved as proportional intensity using the sRGB transfer function. Processed from image credit: Event Horizon Telescope.
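The nearest-color lookup can be sketched as follows, assuming the scraped colorbar is available as an (N, 3) RGB array spanning temperatures from 0 to `T_max`; the function name and arguments are assumptions, and the full script additionally handles the colorbar scraping and the averaging:

```python
import numpy as np

def invert_colormap(image_rgb, cbar_rgb, T_max):
    """Map each pixel of an (H, W, 3) RGB image to the brightness
    temperature whose colorbar entry has the least squared RGB error."""
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 1, 3).astype(float)
    table = cbar_rgb.reshape(1, -1, 3).astype(float)
    err = ((pixels - table) ** 2).sum(axis=2)   # (pixels, entries)
    idx = err.argmin(axis=1)                    # best colorbar entry
    temps = np.linspace(0.0, T_max, cbar_rgb.shape[0])
    return temps[idx].reshape(h, w)
```

The four subfigure temperature maps can then be averaged with `np.mean` and encoded with the sRGB transfer function of Eq. 3.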

The subfigures were assumed to be already registered. Horizontal registration may have a slight additional error introduced by cropping the original figure into subfigures at whole-pixel resolution.

The reconstructed image in Fig. 4 looks quite similar to the published first black hole image, but the central shadow in the reconstructed image is not as pronounced.

The central shadow is even less pronounced in Fig. 5, which answers the question for now. EHT has since published the imaging pipelines, which should allow proper reproduction of the final brightness temperature image.

## Python script 2

(continues Python script 1)

This isn't particularly optimized, but it is fine for a one-off conversion.