DPI of an image - c#

I have a requirement where I have to check the DPI of a bunch of images stored in an Oracle database as a BLOB and make sure they are above 300 DPI. I was planning on reading the field via a C# program and trying to determine their DPI, but I was unsure how to calculate this. Can anyone give me some guidance on how I can determine their DPI?

As the name says, DPI represents the number of dots (pixels) per inch. To calculate it you need the size of the image both in inches and in pixels. The size in inches is relative to the output device's DPI; higher-resolution devices like printers pack more dots into each inch.
So, if your image is 100px wide and 2in wide, its DPI is 100/2 = 50. Realize that the problem is how to determine the size in inches of your image, since basically you need a physical reference for an inch.
That is why it would be easier to verify the DPI of each image at upload time and enforce the requirement before insertion.
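A minimal sketch of such a check, assuming the BLOB column has already been read into a byte[] (the method name is illustrative, and GDI+ only reports a meaningful DPI when the file format actually stores one):
using System.Drawing;  // System.Drawing.Common on modern .NET
using System.IO;

static bool MeetsDpiRequirement(byte[] imageBlob, float minDpi = 300f)
{
    using (var stream = new MemoryStream(imageBlob))
    using (var image = Image.FromStream(stream))
    {
        // GDI+ reads the resolution metadata embedded in the file;
        // formats without it fall back to the screen default (usually 96).
        return image.HorizontalResolution >= minDpi
            && image.VerticalResolution >= minDpi;
    }
}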

Related

Windows DPI setting affects Graphics.DrawString

I have created a new Bitmap object and now want to draw some text to it using GDI+.
So I call Graphics.DrawString(...).
The problem is that the size of the string depends on Windows 7's DPI settings.
Is there any way to make my text drawing independent of the Windows settings?
PS: The DPI settings seem to affect text only. A rect, for example, stays the same size when changing the DPI...
Just found the solution myself:
The key is to create the font with the parameter GraphicsUnit.Pixel. That way, drawing strings becomes independent of the system's DPI settings.
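A minimal sketch of that fix (the bitmap size and font choice are arbitrary):
using System.Drawing;

using (var bmp = new Bitmap(200, 50))
using (var g = Graphics.FromImage(bmp))
// GraphicsUnit.Pixel makes the font size a pixel count, so the rendered
// text no longer scales with the system DPI setting.
using (var font = new Font("Arial", 16f, GraphicsUnit.Pixel))
{
    g.DrawString("Hello", font, Brushes.Black, new PointF(0f, 0f));
}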
You are correct in that the DPI affects only drawable items that are measured in device-independent units. Fonts are typically measured in Points, where 1 point = 1/72 of an inch. Therefore a 10pt font will be the same size in INCHES on each and every screen resolution and will take up more or less pixels depending on the screen resolution and pixel density.
Everything measured in pixels (such as lines, shapes, etc.) will not be affected by DPI, but the actual physical size will vary depending on screen resolution and pixel density. Changing your code to measure fonts in pixels will indeed ensure that the text is the same pixel size at all screen DPI settings, but if you were to print to a printer you'll find that the text size varies with printer resolution.
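To make the conversion concrete, the relationship is pixels = points * DPI / 72. A quick sketch of what that means for a 10pt font (the helper name is ours, not part of any API):
// pixels = points * dpi / 72
static float PointsToPixels(float points, float dpi)
{
    return points * dpi / 72f;
}

// A 10pt font is ~13.3px at 96 DPI (100%), ~16.7px at 120 DPI (125%)
// and 20px at 144 DPI (150%): the same physical size, more pixels.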

RenderTargetBitmap DPI for drawing application

I have been using RenderTargetBitmap to draw lines for my application as shown here.
It's working, but I don't quite understand the DPI. If I need the lines to draw under the mouse, I use a DPI of 96. All I need to know is whether this is the same for all devices and screens and, if not, how to find the correct one.
This is the code I use to get system DPI:
// Get system DPI scale factors relative to the default 96 DPI
Matrix m = PresentationSource.FromVisual(Application.Current.MainWindow)
                             .CompositionTarget.TransformToDevice;
double dpiXFactor, dpiYFactor;
if (m.M11 > 0 && m.M22 > 0)
{
    dpiXFactor = m.M11;
    dpiYFactor = m.M22;
}
else
{
    // Sometimes this can return a matrix with 0s. Fall back to assuming normal DPI in this case.
    dpiXFactor = 1;
    dpiYFactor = 1;
}
This will be the factor of normal DPI (96) the system has. The system will have a horizontal DPI of dpiXFactor * 96 and a vertical DPI of dpiYFactor * 96. You can test this on Windows 7 by opening the Start menu, searching for "dpi", and selecting "Make text and other items larger or smaller". 100% means a factor of 1 and a DPI of 96; 125% means a factor of 1.25 and a DPI of 120; 150% means a factor of 1.5 and a DPI of 144.
The fallback logic is due to a customer crash report which I think could have only been caused by the transform matrix having all zeroes. There might be a more reliable way of getting system DPI (pinvoke, maybe?) but I don't know of it.
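On the P/Invoke idea: one known route is GetDeviceCaps with the LOGPIXELSX/LOGPIXELSY indices. A minimal sketch (the constants are the standard wingdi.h values, and the wrapper class is only illustrative):
using System;
using System.Runtime.InteropServices;

static class SystemDpi
{
    [DllImport("user32.dll")] static extern IntPtr GetDC(IntPtr hWnd);
    [DllImport("user32.dll")] static extern int ReleaseDC(IntPtr hWnd, IntPtr hdc);
    [DllImport("gdi32.dll")] static extern int GetDeviceCaps(IntPtr hdc, int index);

    const int LOGPIXELSX = 88;  // horizontal DPI
    const int LOGPIXELSY = 90;  // vertical DPI

    public static void Print()
    {
        IntPtr hdc = GetDC(IntPtr.Zero);            // device context for the whole screen
        int dpiX = GetDeviceCaps(hdc, LOGPIXELSX);  // 96 at 100%, 120 at 125%, 144 at 150%
        int dpiY = GetDeviceCaps(hdc, LOGPIXELSY);
        ReleaseDC(IntPtr.Zero, hdc);
        Console.WriteLine("{0} x {1}", dpiX, dpiY);
    }
}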
DPI is used when rendering non-vector images. You don't have to worry about changing DPIs for different monitors or screens; that's determined by the OS and monitor hardware. What you do have to consider is the level of pixelation you want in your image. As a general rule of thumb, small images under 100x100 pixels should be fine at 96 DPI; for anything between 100x100 and 300x300 you should go 150-175 DPI; for anything larger than 300x300 you can go 300-400 DPI. After a certain point it's just overkill. Most commercial printing is done under 350 DPI unless you're doing stochastic printing. If you're drawing lines with a stroke of 1-3 you should be fine with 96 DPI; any larger and I would suggest 150 DPI.
The DPI you use depends on what will be done with the output. If your drawing will be viewed only on the screen without zooming, you will be ok with 96dpi and you don't really need to worry about it anymore. If you need to zoom in on your drawing, then higher resolution can provide a better user experience. If printing your drawing is an important part of your user experience, then you'd probably be better served going with 300dpi.
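If you go the system-DPI route, a minimal sketch of creating the bitmap at the detected DPI, reusing dpiXFactor/dpiYFactor from the snippet above (width and height here are assumed to be the desired size in device-independent units; requires System.Windows.Media and System.Windows.Media.Imaging):
var rtb = new RenderTargetBitmap(
    (int)(width * dpiXFactor),   // pixel width
    (int)(height * dpiYFactor),  // pixel height
    96d * dpiXFactor,            // horizontal DPI
    96d * dpiYFactor,            // vertical DPI
    PixelFormats.Pbgra32);
Content drawn in device-independent units then comes out at the right physical size on that screen.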

The difference in image resolution (ppi) between C# and Photoshop

For example, C# says that the selected image contains 96 ppi, while that same image in Photoshop contains 72 ppi.
Why is there a difference?
I’m inclined to trust Photoshop in this case; how can I test image resolution if C# returns false results?
We need to build some sort of validator control that rejects all images with ppi != 300.
Control should support the following formats: jpg, jpeg, gif, png, bmp.
Code is listed below:
Image i = Image.FromFile(FileName);
Console.Write(i.VerticalResolution);
Console.Write(i.HorizontalResolution);
DPI means dots (pixels) per inch. The physical size in inches is subjective, based on the current monitor's size and resolution. Unless you're relying on metadata (which GIF doesn't contain, and which BMP files often omit) you cannot reliably calculate this.
Photoshop simply has a prescribed value for DPI, which it uses when translating images for print. This value is stored in the PSD file and may be copied to JPEG metadata, but if you save the image in a format without DPI metadata, the information is not stored.
Update:
The reason your code gets a different value is that C# fetches its VerticalResolution and HorizontalResolution values from the current DPI setting on the computer. Photoshop's DPI is for use with print, so it knows the physical dimensions if you want to send your image to a printer. It has a default value of 72dpi, but you can change this. The value has no meaning on a screen, though, since screens deal in pixels only.
DPI means dots per inch. A bitmap image does not have an inherent DPI; it merely has a size, which is the number of pixels in the horizontal and the number of pixels in the vertical (width and height). An image only gains a resolution (in DPI) when you say how many pixels you want to squeeze into each inch.
So if I have an image that is 100 pixels wide and 100 pixels high (100px × 100px), it will be 100 DPI if I print it (or convert it into a format that dictates the print size) such that it fits exactly in one square inch (1" × 1"). It will be 50 DPI if I print it to fit in a square that is two inches by two inches, etc.
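The same relationship, spelled out (the helper is purely illustrative):
// An image only gets a DPI once you choose a physical size for it
static double Dpi(int pixels, double inches)
{
    return pixels / inches;
}

// 100px printed at 1in -> 100 DPI; the same 100px at 2in -> 50 DPI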

DPI and PPI | DPI OR PPI

Is there any difference between DPI and PPI? Is it true that Dot = Pixel? And lastly, what are DpiX and DpiY in C#.NET? How can we change them?
DPI
Stands for Dots Per Inch and is used when printing. It is also heavily misused in the context of screen resolution. A colour ink printer doesn't mix the colors; instead each color has its own "slot". When a printer says 12000 DPI it is actually 12000 / 5 "pixels" for a 5-color printer. Nevertheless, DPI only becomes interesting in the context of a printed result. Photoshop uses the image's DPI information to be able to tell you the printed size of the image. Many people misinterpret DPI as a measurement of the picture's image quality or pixel density. For example, if you change the DPI for an image in Photoshop without resampling the image, you still have the exact same number of pixels, but now Photoshop shows another "physical" dimension (inches or whatever you use).
Look at this Wiki article.
PPI
Stands for Pixels Per Inch (or pixel density) and should be used with screens, where one pixel is defined as the area covering all three base colors (Red, Green and Blue). Look at this Wiki article.
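As for DpiX and DpiY in C#: in System.Drawing they are properties of a Graphics object reporting the resolution of its drawing surface, and Bitmap.SetResolution re-stamps an image's DPI metadata without resampling, much like the Photoshop behavior described above. A minimal sketch:
using System;
using System.Drawing;

using (var bmp = new Bitmap(100, 100))
{
    using (var g = Graphics.FromImage(bmp))
    {
        // Resolution of this drawing surface, in dots per inch
        Console.WriteLine("{0} x {1}", g.DpiX, g.DpiY);
    }
    // Change the DPI metadata only; the pixel data is untouched
    bmp.SetResolution(300f, 300f);
}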

C# WPF resolution independence?

I am developing a map control in WPF with C#. I am using a canvas control e.g. 400 x 200 which is assigned a map area of e.g. 2,000m x 1,000m.
The scale of the map would be: canvas_size_in_meters / real_size_in_meters.
I want to find the canvas_size_in_meters.
The canvas.ActualWidth gives the width in DIUs (Device Independent Units). So, 400 DIUs is 400/96 = 4.17 inches, PROVIDED that the physical resolution of my monitor is 96 DPI.
However, using a ruler, I found that the physical resolution of my monitor is 87 DPI. (There are only a few monitors that ACTUALLY have 96 physical DPI.)
That DPI difference (10%) translates to a +10% difference in the actual map control width on screen.
How do I measure the size of a WPF control in inches EXACTLY and regardless of screen resolution and DPI setting ?
How do I measure the size of a WPF control in inches EXACTLY and regardless of screen resolution and DPI setting ?
This isn't actually possible, because for it to work, WPF would have to know the resolution (in terms of DPI) of your monitor. Sounds nice in theory, but in practice Windows doesn't know this information. This is why Windows itself always blindly assumes 96 DPI instead of being smarter about it.
Even if there were some way to manually tell it, or if your particular monitor has a custom driver that does pass the correct information to Windows, this isn't going to work on anyone else's computer, so Windows doesn't pass this information on to any applications.
The best you can do is draw a scale like Google Maps does. If you know that 1 pixel == 1 mile, you can draw a 50-pixel line on your map with a label saying "this line equals 50 miles".
There is a way to compute the current pixel size in mm or inches. As mentioned in the earlier posts, it is not a fixed value and varies depending on the current resolution and monitor size.
First get the current resolution. Assume it is 1280x1024
Now get the monitor width in mm using the GetDeviceCaps function. It's a standard Windows library function.
int widthmm = GetDeviceCaps(deviceContext, HORZSIZE);
My monitor width is 362mm
So pixel size = 362/1280 = 0.283 mm
The accuracy of this method depends on the assumption that the display area covers the width of the monitor exactly.
So to answer the original question, the canvas size of 400 x 200 pixels would be
(400 * 0.283/1000) x (200 * 0.283/1000) in meters when shown on my monitor.
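A self-contained sketch of that computation, assuming the usual GetDeviceCaps declarations (HORZSIZE and HORZRES are the standard wingdi.h index values):
using System;
using System.Runtime.InteropServices;

class PixelSize
{
    [DllImport("user32.dll")] static extern IntPtr GetDC(IntPtr hWnd);
    [DllImport("user32.dll")] static extern int ReleaseDC(IntPtr hWnd, IntPtr hdc);
    [DllImport("gdi32.dll")] static extern int GetDeviceCaps(IntPtr hdc, int index);

    const int HORZSIZE = 4;  // display width in millimetres
    const int HORZRES = 8;   // display width in pixels

    static void Main()
    {
        IntPtr hdc = GetDC(IntPtr.Zero);
        int widthMm = GetDeviceCaps(hdc, HORZSIZE);  // e.g. 362
        int widthPx = GetDeviceCaps(hdc, HORZRES);   // e.g. 1280
        ReleaseDC(IntPtr.Zero, hdc);

        // e.g. 362 / 1280 = 0.283 mm per pixel
        Console.WriteLine("Pixel size: {0:F3} mm", (double)widthMm / widthPx);
    }
}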
Thank you for your prompt reply.
I totally agree, but I didn't want to believe it in the first place. You see, there has to be an approximate calculation of the map's scale if the map is used to display different layers of map data (scale dependent).
Most applications use a slider control with e.g. 10 discrete map levels to set the "scale".
Having an absolute scale is not crucial for the application, it would be nice to display an indicative scale, like 1:15,000.
An absolute scale would require an extra variable, monitorPhysicalDPI (initially set to 96), that, if the user chooses to adjust it, would give slightly better scaling (again, it's not crucial). The size of the map control would be:
map.ActualWidth * (96/monitorPhysicalDPI) * inchesPerDIU, where inchesPerDIU is 1/96
Again, these are cosmetics. Wouldn't it be nice if Windows knew the control's ACTUAL dimensions? (The user would have to provide information about the screen dimensions during OS setup, or simply install the monitor's INF file.)
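A sketch of that indicative-scale idea, where map, mapWidthMeters and the user-adjustable monitorPhysicalDPI are assumptions taken from the discussion above:
const double InchesPerDiu = 1.0 / 96.0;
const double MetersPerInch = 0.0254;

double monitorPhysicalDPI = 96.0;  // default; the user may override it

// Physical width of the control, per the formula above
double controlWidthInches = map.ActualWidth * (96.0 / monitorPhysicalDPI) * InchesPerDiu;
double controlWidthMeters = controlWidthInches * MetersPerInch;

// e.g. 400 DIUs at a true 96 DPI -> 4.17 in = 0.106 m; over 2,000 m that's about 1:18,900
double scaleDenominator = mapWidthMeters / controlWidthMeters;
Console.WriteLine("Scale 1:{0:N0}", scaleDenominator);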
