FAQ: What is the difference in how projections are handled by Image Server and ArcGIS Desktop?


Image Server and ArcGIS Desktop use the same projection engine. Both products transform raster data by computing a grid that is bilinearly interpolated and refined until this grid reaches a predefined accuracy tolerance.

The default tolerance for the grid computed by Image Server is 0.5 pixels (in output space), whereas ArcGIS Desktop uses a lower tolerance value that translates into higher accuracy. Image Server densifies the grid until all errors are lower than the tolerance or until the number of grid cells exceeds 10,000. The extent of each request is analyzed to determine the density of the grid, and densification stops once either the accuracy tolerance is met or the maximum number of cells is reached.

The 10,000-cell limit is not reached in the vast majority of cases. The few cases in which it is reached typically occur when viewing data at a small scale in a complex or curved projection, or in a projection that features interruptions in the display (for example, Fuller and Cube). In such cases, Image Server defaults to using approximately 10,000 cells, while ArcGIS Desktop reverts to a by-pixel method. As a result, such projections may display small steps at the edges when viewed with Image Server.
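A minimal sketch of this densification loop, assuming a toy quadratic transform and invented names (transform, max_error) in place of the real projection engine:

```python
# Illustrative sketch only: transform(), the square extent, and the
# doubling strategy are assumptions, not Image Server's actual engine.
import math

TOLERANCE = 0.5       # default accuracy tolerance, in output pixels
MAX_CELLS = 10_000    # densification stops beyond this many grid cells

def transform(x, y):
    # Stand-in for an expensive, nonlinear map projection.
    return (x + 0.001 * y * y, y + 0.001 * x * x)

def bilinear_centre(corners):
    # Bilinear interpolation at a cell centre = mean of the four corners.
    return tuple(sum(c) / 4 for c in zip(*corners))

def max_error(n, extent=1000.0):
    """Largest centre-point error (in output units) on an n x n grid."""
    step = extent / n
    worst = 0.0
    for i in range(n):
        for j in range(n):
            x0, y0 = i * step, j * step
            corners = [transform(x0 + dx, y0 + dy)
                       for dx in (0.0, step) for dy in (0.0, step)]
            approx = bilinear_centre(corners)
            exact = transform(x0 + step / 2, y0 + step / 2)
            worst = max(worst, math.dist(approx, exact))
    return worst

# Densify: double the grid until it is accurate enough or the cap is hit.
n = 2
while max_error(n) > TOLERANCE and (2 * n) ** 2 <= MAX_CELLS:
    n *= 2
```

Doubling the grid until the centre-point error falls under the tolerance mirrors the refine-until-accurate behavior described above; the real engine analyzes the extent of each request rather than a fixed square.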

Image Server enables the accuracy of the calculated grid to be controlled using the Sample2D process, which can be added to the raster process chain of an image. This process defines not only when the sampling takes place, but also the accuracy of the transformation. While the Image Server defaults of 0.5 pixels and 10,000 cells are suitable for most applications, these values may need to be revised for surface-model analysis that requires higher accuracy, or when large extents of higher-resolution imagery need to be exported.

If the default accuracy is insufficient for a particular service, apply this process to set the desired accuracy (for example, 0.01) and maximum number of cells (for example, 1,000,000), thereby raising the densification threshold.
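Some back-of-the-envelope arithmetic shows what those cell caps mean for a large request; the 10,000 x 10,000-pixel export below is a hypothetical example, not an Image Server value:

```python
# Rough arithmetic only: the request size is an illustrative assumption.
import math

def pixels_per_grid_cell(request_pixels, max_cells):
    """Output pixels spanned by one warp-grid cell, assuming a square
    request densified all the way to the cell cap."""
    cells_per_side = math.isqrt(max_cells)   # 10,000 cells -> 100 per side
    return request_pixels / cells_per_side

default_cap = pixels_per_grid_cell(10_000, 10_000)     # coarse: 100 px/cell
raised_cap = pixels_per_grid_cell(10_000, 1_000_000)   # fine: 10 px/cell
```

With the default cap, one grid cell spans roughly 100 output pixels of such an export; raising the cap to 1,000,000 cells tightens that to about 10 pixels per cell.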

Image Server always performs a single sampling of the imagery between input and output to mitigate the effects of sampling. The location of the 2D Sampler in the process chain also defines when the sampling takes place. In the default operation, sampling takes place just prior to mosaicking the imagery for display, but certain services may require a modification of this behavior.

For example, if a service applies a convolution filter, it is advantageous to apply the filter before the image is sampled. Positioning the 2D Sampler so that it follows the convolution filter in the processing order forces the filter to be applied first. Doing so may also benefit scenarios where visualization is applied to DEMs.
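A toy one-dimensional illustration, with invented data and filters, of why the sampler's position matters: a filter applied after sampling can only see what survived the sampling.

```python
# Hypothetical 1-D example; the signal and filters are invented for
# illustration and are not Image Server processes.

def smooth(signal):
    """3-tap mean filter (a stand-in convolution) with edge replication."""
    padded = [signal[0]] + list(signal) + [signal[-1]]
    return [(padded[i] + padded[i + 1] + padded[i + 2]) / 3
            for i in range(len(signal))]

def sample(signal, step=2):
    """Naive decimation: keep every step-th value."""
    return signal[::step]

signal = [0, 9, 0, 9, 0, 9, 0, 9]      # high-frequency detail

filter_first = sample(smooth(signal))  # sampler placed after the filter
sample_first = smooth(sample(signal))  # sampler placed before the filter
```

Filtering first preserves the averaged detail, while sampling first discards every second value and the filter then sees only zeros, which is why the order of the sampler and the filter in the chain changes the result.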

Another difference between Image Server and ArcGIS Desktop applications is how datum transformations are deployed. ArcMap uses a single transformation between each input dataset and the output projection of the data frame. Since an image service may contain rasters defined in numerous projections with varying datums, it is not possible to prompt the user for the transformation to be used. As a result, Image Server uses the AISDatums.txt file to define the transformation to be applied between any two specified datums. For additional information on the functionality of the AISDatums file, review the ArcGIS Desktop 9.3 Help topic, "Handling the projection transformations in ArcGIS Image Server", linked in the Related Information section below.
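The idea of a pairwise datum lookup can be sketched as follows; the entries and the lookup logic below are hypothetical, not the actual AISDatums.txt syntax, so consult the linked Help topic for the real format.

```python
# Hypothetical sketch of a pairwise datum-transformation lookup, in the
# spirit of AISDatums.txt. The table contents are illustrative only.

transformations = {
    ("NAD_1927", "WGS_1984"): "NAD_1927_To_WGS_1984_1",
    ("ED_1950", "WGS_1984"): "ED_1950_To_WGS_1984_1",
}

def lookup(source_datum, target_datum):
    """Return the transformation for a datum pair, in either direction."""
    if source_datum == target_datum:
        return None                      # no transformation needed
    key = (source_datum, target_datum)
    if key in transformations:
        return transformations[key]
    return transformations.get((target_datum, source_datum))
```

Because every raster in the service is matched against the output datum through this one table, no per-dataset prompt is needed, at the cost of having to maintain the table by hand.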

Related Information