After radio waves and UV, it’s back to visible light, says S. Ananthanarayanan.
The Large Synoptic Survey Telescope (LSST), being set up in Chile, South America, is an 8.4 metre-wide reflector that focuses images of a wide swathe of the sky onto a 189-segment, 3,200-megapixel detector. Once in action, it would be able to image a golf ball from 24 kilometres away, and its sensitivity would be 100 million times greater than that of the human eye.
This is a long way ahead of the best optical telescopes that we have built so far. To get the best of both magnification and detail, an optical telescope needs lenses of the finest quality and of the largest diameter possible. Deficiencies that are inherent in glass lenses become serious when the lenses grow very large. This fact brought in the mirror, or reflecting, telescope, with very large apertures, and faint images were made visible with sensitive photographic film and long exposures.
However, there are limits, both to the detail and the sensitivity, when we use visible light. Detail is limited by the wavelength of visible light, given the dimensions of lenses or mirrors that are practical. And for sensitivity, much of the visible light from distant objects gets scattered before it reaches the earth. The most distant objects are thus too dim to be seen.
A way to overcome the limit on sensitivity was to spot the most distant objects by the radio waves they emit, rather than by their light. As radio waves have wavelengths thousands of times greater than those of light, they are scattered much less, and it is by radio telescopes that the farthest objects have been observed. In place of the photographic plate or the eye, radio images are detected by an array of radio antennas spread over a large area. The timing of the waves that each antenna receives helps work out where the waves would have focused, had there been a lens, and this is how the image is built up.
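That timing trick can be pictured with a toy ‘delay-and-sum’ calculation, sketched below in Python. The antenna positions, the radio burst and the direction of the source are all invented for the illustration; real aperture-synthesis imaging is far more elaborate than this.

    import numpy as np

    # Toy 'delay-and-sum' picture of locating a source from antenna timings.
    c = 3.0e8                                        # speed of light, m/s
    antenna_x = np.array([0.0, 10.0, 25.0, 40.0])    # antennas along a line, metres
    true_angle = np.deg2rad(20.0)                    # direction of the source (invented)

    t = np.linspace(0.0, 2e-7, 2001)                 # time samples, seconds

    def pulse(tt):
        # a short radio burst, shaped as a narrow Gaussian in time
        return np.exp(-((tt - 1.0e-7) / 5e-9) ** 2)

    # Each antenna records the same burst, shifted by its geometric delay
    geom_delay = antenna_x * np.sin(true_angle) / c

    # Try many candidate directions; shift the recordings by the matching
    # trial delays and see for which direction the signals add up most strongly
    trial_angles = np.deg2rad(np.linspace(0.0, 40.0, 401))
    power = []
    for a in trial_angles:
        trial_delay = antenna_x * np.sin(a) / c
        aligned = sum(pulse(t - geom_delay[i] + trial_delay[i])
                      for i in range(len(antenna_x)))
        power.append(np.mean(aligned ** 2))

    best = np.rad2deg(trial_angles[int(np.argmax(power))])
    print(f"direction recovered from the timings: {best:.1f} degrees")   # about 20.0

The sum is largest only when the trial delays match the true geometric delays, and that, in essence, is how an array of antennas ‘points’ at a source without any lens.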
While the larger wavelength of radio waves helps detect faint signals, the same large wavelength reduces the detail that can be resolved. This, however, is compensated by the large area over which signals are collected, which has the effect of a very large lens or mirror in the visible range.
As for the limit to detail that visible light presents, a way around it was to use ultraviolet light, or even X-rays. Now, the atmosphere is opaque to UV and X-rays. These telescopes hence had to be stationed outside the atmosphere, which became practical with satellites placed in orbit around the earth. The celebrated Chandra X-ray telescope has been in orbit around the earth since 1999 and has provided us with detailed images of high-energy processes, like supernovae and the surroundings of black holes.
The earliest telescopes, naturally, were visible light telescopes where the detector was the human eye. Galileo is credited with the first astronomical observation using the newly invented device, when he saw the four largest satellites of Jupiter. Telescopes rapidly improved in quality, with lenses that compensated for the splitting of light into its component colours when it passed through glass, as well as for non-uniform magnification, which arose because the lenses were not the ideal, ‘thin’ lenses. The limitations of lenses found an answer in using mirrors in the place of lenses, and the largest telescopes we now have are the ‘reflectors’.
Along with the development of telescopes came progress in cameras. Cameras, of course, even when they are used at the focus of a telescope mirror, must use lenses. While the lenses were improved, with larger diameter, or aperture, to admit more light, the quality of the film on which the image was captured was also refined. While we are yet to have a medium that separately captures each photon of light, the grains of chemical that register the presence of light were made finer, to capture more detail. And with long exposure times, even dim images can be made visible.
Photo film has now made way for an array of electronic, light-sensitive specks of silicon oxide on a sliver of silicon. In photo film, the bright spots of an image cause chemical change, and the image is recorded on the film. With the electronic sensor, the bright spots of the image cause a build-up of charge in the specks on the sliver of silicon. These charges are then transferred to a collector, which creates a digital record of how much light fell on each of the specks in the silicon screen. The record can then be used to display the image for the user of the camera to frame the picture, or to save the record when the picture is clicked.
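As a cartoon of what such a sensor does, the short Python sketch below makes up a tiny 8 x 8 ‘chip’, lets each speck collect charge in proportion to the light falling on it, and reads the charges out as a grid of numbers. The scene and the figures are invented; a real camera chip is, of course, far more refined.

    import numpy as np

    rng = np.random.default_rng(0)

    # Light falling on a tiny 8 x 8 sensor: a dim background and one bright 'star'
    light = np.full((8, 8), 5.0)
    light[3, 4] = 200.0

    # During the exposure, photons arrive at random, so the charge collected
    # in each speck is a Poisson-distributed count
    charge = rng.poisson(light)

    # Readout: the charges become the digital record of the image
    # (here, one unit of charge is simply one count)
    digital_image = charge
    print(digital_image)
    print("brightest speck at row, column:",
          np.unravel_index(digital_image.argmax(), digital_image.shape))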
The resolution, or the detail, in the photo film depended on the fineness of the chemical specks deposited on the film. In the digital camera, the equivalent is the number of light sensors that act to capture the image. If the device has 1,000 rows and 1,000 columns of sensors, there would be a million sensors in all, and we would speak of a one-megapixel camera. We are familiar with modern mobile phones that sport 8-megapixel cameras – their pictures are as good as excellent conventional photographs. The resolution of professional digital cameras goes as high as 50 megapixels.
In this context, the camera of the LSST would take pictures with 3,200 megapixels. The resolution actually works out to 3,024 megapixels, which is the equivalent of an array with some 55,000 pixels in each direction. The imaging is to happen over 189 separate sensors, each capable of 16 megapixels (189 × 16 = 3,024). While the sensor of a normal digital camera is about 1.4 inches across, the panel of 189 sensors stretches to more than 2 feet. The sensitivity and detail then equal what is feasible with much longer and shorter wavelengths – and over a range from the near infra-red to the ultra-violet.
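The arithmetic behind these figures can be checked in a few lines, using Python simply as a calculator:

    # Checking the pixel arithmetic quoted above
    phone = 1000 * 1000                    # 1,000 rows x 1,000 columns of sensors
    lsst = 189 * 16 * 10**6                # 189 sensors of 16 megapixels each
    print(phone // 10**6, "megapixel")                        # 1
    print(lsst // 10**6, "megapixels for the LSST camera")    # 3024
    print(round(lsst ** 0.5), "pixels along each side of an equivalent square array")   # about 55,000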
The LSST itself is planned to be in action in 2022, but the sensor array for the camera has been assembled and was recently tested. As the sensors are so sensitive, they need to be protected from all kinds of stray radiation, or ‘noise’. To this end, the array is cooled to 100°C below freezing. As the telescope is yet to be built, the camera was tested by taking a picture of a head of broccoli – and the picture it took showed the amazing detail that the camera is capable of.
From its position in the Vera C. Rubin Observatory, currently under construction in Chile, the LSST will survey the entire Southern Hemisphere sky every few days, and keep doing so for the next ten years. Data amounting to fifteen terabytes would be collected every night, and within a month the LSST would have recorded more detail of the cosmos than all previous astronomical surveys.
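A rough sense of the total, assuming, only for the arithmetic, that fifteen terabytes are gathered on every night of the ten years:

    # Rough scale of the data over a ten-year survey (illustrative assumption:
    # data is collected on every night of the ten years)
    tb_per_night = 15
    total_tb = tb_per_night * 365 * 10
    print(total_tb, "terabytes, or roughly", round(total_tb / 1000), "petabytes")   # about 55 petabytes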
The field of astronomy is teeming with puzzling phenomena, like dark matter and dark energy. “With the huge data that would become available with the LSST, we could build a 3D map of dark matter in the universe, and how it changes with time,” said a participant in a round table discussion of the difference the project is going to make.
Vera C. Rubin and dark matter
Vera Florence Cooper Rubin (1928-2016) was an American astronomer who pioneered work on galaxy rotation rates. The periphery of a galaxy is found to move as fast as portions deeper within. This is unlike motion around a gravitational centre (the outer planets of the solar system, for instance, move slower than planets closer to the sun) and more like the movement of a disk. The anomaly is explained by invoking dark matter as filling the space within the galaxy.
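The puzzle can be seen with a few lines of Python. If a galaxy held only its visible matter, concentrated towards the centre, the orbital speed at a distance r should fall off roughly as the square root of G·M/r, as planetary speeds do; the mass and distances used below are round, illustrative values, not measurements.

    import math

    G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
    M_visible = 2e41         # roughly a hundred billion solar masses, in kg
    kpc = 3.086e19           # one kiloparsec in metres

    for r_kpc in (5, 10, 20, 40):
        v = math.sqrt(G * M_visible / (r_kpc * kpc)) / 1000.0    # speed in km/s
        print(f"r = {r_kpc:>2} kpc : about {v:.0f} km/s expected from visible mass alone")

    # Measured speeds instead stay roughly level out to large radii; the extra
    # gravity needed to hold the fast-moving outskirts is ascribed to dark matter.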
Do respond to: response@simplescience.in