IMAQ Vision Concepts Manual
1. [End of Table 10-9, particle sum measurements] Sum x (Σx), Sum y (Σy), Sum xx (Σxx), Sum xy (Σxy), Sum yy (Σyy), Sum xxx (Σxxx), Sum xxy (Σxxy), Sum xyy (Σxyy), Sum yyy (Σyyy)

© National Instruments Corporation 10-17 IMAQ Vision Concepts Manual

Chapter 10. Particle Measurements

Moments
Table 10-10 lists the IMAQ Vision particle moment measurements, where A is the particle area, x̄ = Σx/A, and ȳ = Σy/A.

Table 10-10. Moments
Moment of Inertia xx (Ixx): Σxx − (Σx)²/A
Moment of Inertia xy (Ixy): Σxy − (Σx)(Σy)/A
Moment of Inertia yy (Iyy): Σyy − (Σy)²/A
Moment of Inertia xxx (Ixxx): Σxxx − 3x̄Σxx + 2x̄²Σx
Moment of Inertia xxy (Ixxy): Σxxy − 2x̄Σxy − ȳΣxx + 2x̄²Σy
Moment of Inertia xyy (Ixyy): Σxyy − 2ȳΣxy − x̄Σyy + 2ȳ²Σx
Moment of Inertia yyy (Iyyy): Σyyy − 3ȳΣyy + 2ȳ²Σy
Norm. Moment of Inertia xx (Nxx): Ixx/A²
Norm. Moment of Inertia xy (Nxy): Ixy/A²
Norm. Moment of Inertia yy (Nyy): Iyy/A²
Norm. Moment of Inertia xxx (Nxxx): Ixxx/A^2.5
Norm. Moment of Inertia xxy (Nxxy): Ixxy/A^2.5
Norm. Moment of Inertia xyy (Nxyy): Ixyy/A^2.5
Norm. Moment of Inertia yyy (Nyyy): Iyyy/A^2.5
Hu Moment 1 (H1): Nxx + Nyy
Hu Moment 2 (H2): (Nxx − Nyy)² + 4Nxy²
Hu Moment 3 (H3): (Nxxx − 3Nxyy)² + (3Nxxy − Nyyy)²
Hu Moment 4 (H4): (Nxxx + Nxyy)² + (Nxxy + Nyyy)²
Hu Moment 5 (H5): (Nxxx − 3Nxyy)(Nxxx + Nxyy)[(Nxxx + Nxyy)² − 3(Nxxy + Nyyy)²] + (3Nxxy − Nyyy)(Nxxy + Nyyy)[3(Nxxx + Nxyy)² − (Nxxy + Nyyy)²]
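The second-order formulas above can be checked numerically. The following sketch is illustrative only (the helper name and the pixel-coordinate interface are assumptions, not IMAQ Vision code); it computes the normalized second-order moments and the first two Hu moments directly from a particle's pixel coordinates.

```python
import numpy as np

def particle_moments(xs, ys):
    """Hypothetical helper: central moments of inertia, normalized moments,
    and the first two Hu moments from a particle's pixel coordinates."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = xs.size                                  # particle area in pixels
    dx, dy = xs - xs.mean(), ys - ys.mean()
    Ixx, Ixy, Iyy = (dx**2).sum(), (dx*dy).sum(), (dy**2).sum()
    # Second-order normalization divides by A**2
    Nxx, Nxy, Nyy = Ixx / A**2, Ixy / A**2, Iyy / A**2
    H1 = Nxx + Nyy
    H2 = (Nxx - Nyy)**2 + 4 * Nxy**2
    return H1, H2

# A 3x3 square particle: H2 is zero because Nxx = Nyy and Nxy = 0.
xs, ys = np.meshgrid(range(3), range(3))
H1, H2 = particle_moments(xs.ravel(), ys.ravel())
```

Because H1 and H2 depend only on centered sums, both values are unchanged if the particle is translated or rotated by 90 degrees.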
2. Figure 3-6. Origin and Angle of a Coordinate System

The second axis direction can be either indirect, as shown in Figure 3-7a, or direct, as shown in Figure 3-7b. The indirect axis orientation corresponds to the way a coordinate system is present in digital images. The direct axis orientation corresponds to the way a coordinate system is present in the real world.

Figure 3-7. Axis Direction of a Coordinate System

If you do not specify a coordinate system, the calibration process defines a default coordinate system as follows:
1. The origin is placed at the center of the left, topmost dot in the calibration grid.
2. The angle is set to zero. This aligns the x-axis with the topmost row of dots in the grid.
3. The axis direction is set to indirect. This aligns the y-axis to the leftmost column of the dots in the grid.

If you specify a list of points instead of a grid for the calibration process, the software defines a default coordinate system as follows:
1. The origin is placed at the point in the list with the lowest x-coordinate value and then the lowest y-coordinate value.
2. The angle is set to zero.
3. The axis direction is set to indirect.

If you define a coordinate system yourself, remember the following:
• Express the origin in pi
3. Figure 14-16. Overview of the Color Pattern Matching Process

Instrument Readers
This chapter contains information about instrument readers that read meters, liquid crystal displays (LCDs), and barcodes.

Introduction
Instrument readers are functions you can use to accelerate the development of applications that require reading meters, seven-segment displays, and barcodes.

When to Use
Use instrument readers when you need to obtain information from images of simple meters, LCD displays, and barcodes.

Meter Functions
Meter functions simplify and accelerate the development of applications that require reading values from meters or gauges. These functions provide high-level vision processes to extract the position of a meter or gauge needle. You can use this information to build different applications, such as the calibration of a gauge. Use the functions to compute the base of the needle and its extremities from an area of interest indicating the initial and the full-scale position of the needle. You then can use these VIs to read the position of the needle using parameters computed earlier.

The recognition process consists of two phases:
• A learning phase, during which the user must specify the extremities of the needle
• An analysis phase, during which the current position of the needle is determined

The meter functions are designed to wo
4. A Square Root, or Power 1/Y, transformation (where Y = 2) produces the following image and histograms.

A Logarithm transformation produces the following image and histograms.

Exponential and Gamma Correction
The exponential and gamma corrections expand high gray-level ranges while compressing low gray-level ranges. When using the gray palette, these transformations decrease the overall brightness of an image and increase the contrast in bright areas at the expense of the contrast in dark areas. The following graphs show how the transformations behave. The horizontal axis represents the input gray-level range, and the vertical axis represents the output gray-level range. Each input gray-level value is plotted vertically, and its point of intersection with the look-up curve then is plotted horizontally to give an output value.

The Exponential, Square, and Power Y functions expand intervals containing high gray-level values while compressing intervals containing low gray-level values. The higher the gamma coefficient Y, the stronger the intensity correction. The Exponential correction has a stronger effect than the Power Y function.
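These corrections are naturally implemented as lookup tables. The sketch below (Python/NumPy for illustration only; the gamma value is an example, not an IMAQ Vision default) builds Power Y and Power 1/Y lookup tables for an 8-bit image and applies one by array indexing.

```python
import numpy as np

gamma = 2.0                                   # example gamma coefficient Y
levels = np.arange(256, dtype=float)

# Power Y expands high gray-level intervals and compresses low ones.
lut_power = (255.0 * (levels / 255.0) ** gamma).round().astype(np.uint8)
# Power 1/Y (Square Root when Y = 2) has the opposite effect.
lut_root = (255.0 * (levels / 255.0) ** (1.0 / gamma)).round().astype(np.uint8)

image = np.array([[0, 64, 128, 255]], dtype=np.uint8)
transformed = lut_power[image]                # apply the LUT by indexing
```

Indexing an image through a 256-entry table is how all of the lookup transformations in this chapter can be applied in a single pass.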
5. Square Frame
In a square frame, pixels line up as they normally do. Figure 9-6 shows a pixel in a square frame surrounded by its eight neighbors. If d is the distance from the vertical and horizontal neighbors to the central pixel, then the diagonal neighbors are at a distance of √2·d from the central pixel.

Figure 9-6. Square Frame

Hexagonal Frame
In a hexagonal frame, the even lines of an image shift half a pixel to the right. Therefore, the hexagonal frame places the pixels in a configuration similar to a true circle. Figure 9-7 shows a pixel in a hexagonal frame surrounded by its six neighbors. Each neighbor is an equal distance d from the central pixel, which results in highly precise morphological measurements.

Figure 9-7. Hexagonal Frame

Connectivity
When to Use
After you identify the pixels belonging to a specified intensity threshold, IMAQ Vision groups them into particles. This grouping process introduces the concept of connectivity. You can set the pixel connectivity in some functions to specify how IMAQ Vision determines whether two adjoining pixels are included in the same particle. Use connectivity-4 when you want IMAQ Vision to consider pixels to be part of the same particle only when the pixels touch along an adjacent edge. Use connectivity-8 when you want IMAQ Vis
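The effect of the connectivity setting can be demonstrated with a minimal flood-fill labeler (an illustrative sketch only; IMAQ Vision's particle labeling is internal and the function name here is hypothetical).

```python
import numpy as np

def count_particles(binary, connectivity=4):
    """Count connected particles in a binary image under connectivity-4
    (edge-adjacent neighbors) or connectivity-8 (edge or corner)."""
    offsets4 = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    offsets8 = offsets4 + [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    offsets = offsets4 if connectivity == 4 else offsets8
    seen = np.zeros_like(binary, dtype=bool)
    rows, cols = binary.shape
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r, c] and not seen[r, c]:
                count += 1                       # new particle found
                stack = [(r, c)]
                seen[r, c] = True
                while stack:                     # flood fill the particle
                    y, x = stack.pop()
                    for dy, dx in offsets:
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
    return count

# Two pixels touching only at a corner: one particle under connectivity-8,
# two particles under connectivity-4.
img = np.array([[1, 0],
                [0, 1]], dtype=bool)
```

Choosing connectivity-4 versus connectivity-8 therefore changes the particle count whenever particles touch only at corners.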
6. A Laplacian filter highlights contours to produce the following image.

Kernel Definition
The Laplacian convolution filter is a second-order derivative, and its kernel uses the following model:

a  b  c
d  x  d
c  b  a

where a, b, c, and d are integers. The Laplacian filter has two different effects, depending on whether the central coefficient x is equal to or greater than the sum of the absolute values of the outer coefficients.

Contour Extraction and Highlighting
If the central coefficient is equal to this sum (x = 2(|a| + |b| + |c| + |d|)), the Laplacian filter extracts the pixels where significant variations of light intensity are found. The presence of sharp edges, boundaries between objects, modification in the texture of a background, noise, or other effects can cause these variations. The transformed image contains white contours on a black background.

Notice the following source image, Laplacian kernel, and filtered image:

Source Image / Laplacian kernel / Filtered Image
−1 −1 −1
−1  8 −1
−1 −1 −1

If the central coefficient is greater than the sum of the outer coefficients (x > 2(|a| + |b| + |c| + |d|)), the Laplacian filter detects the same variations as mentioned above but superimposes them over the source image. The transformed image loo
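The contour-extraction case can be verified directly: with the kernel above, the central coefficient 8 equals the sum of the absolute values of the eight outer coefficients, so a region of uniform intensity produces a response of exactly zero. The convolution routine below is a minimal illustrative sketch, not IMAQ Vision code.

```python
import numpy as np

# Contour-extraction Laplacian: x = 2(|a| + |b| + |c| + |d|) = 8
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]])

def convolve_valid(image, kernel):
    """Minimal 2-D convolution with no padding, for illustration only."""
    kh, kw = kernel.shape
    out = np.zeros((image.shape[0] - kh + 1, image.shape[1] - kw + 1),
                   dtype=int)
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = int((image[r:r+kh, c:c+kw] * kernel).sum())
    return out

# A flat region gives 0 everywhere: the filter responds only where the
# light intensity varies, which is what produces white contours on black.
flat = np.full((5, 5), 100)
response = convolve_valid(flat, kernel)
```

Increasing the central coefficient above 8 would add a scaled copy of the source image to this response, which is the highlighting case described next.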
7. Figure 1-10. CIE Chromaticity Diagram

The three color components R, G, and B define a triangle inside the CIE diagram of Figure 1-10. Any color within the triangle can be formed by mixing R, G, and B. The triangle is called a gamut. Because the gamut is only a subset of the CIE color space, combinations of R, G, and B cannot generate all visible colors.

To transform values back to the RGB space from the CIE XYZ space, use the following matrix operation:

R =  3.240479·X − 1.537150·Y − 0.498535·Z
G = −0.969256·X + 1.875992·Y + 0.041556·Z
B =  0.055648·X − 0.204043·Y + 1.057311·Z

Notice that the transform matrix has negative coefficients. Therefore, some XYZ colors may transform to R, G, B values that are negative or greater than one. This means that not all visible colors can be produced using the RGB color space.

RGB and CIE L*a*b*
To transform RGB to CIE L*a*b*, you first must transform the RGB values into the CIE XYZ space. Use the following equations to convert the CIE XYZ values into the CIE L*a*b* values:

L* = 116 × (Y/Yn)^(1/3) − 16    for Y/Yn > 0.008856
L* = 903.3 × (Y/Yn)             otherwise
a* = 500 × (f(X/Xn) − f(Y/Yn))
b* = 200 × (f(Y/Yn) − f(Z/Zn))

where

f(t) = t^(1/3)              for t > 0.008856
f(t) = 7.787·t + 16/116     otherwise

Here Xn
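A quick numerical check of the matrix above (Python/NumPy used only for illustration): the D65 white point, assumed here as XYZ ≈ (0.950456, 1.0, 1.088754), should map back to RGB white (1, 1, 1), and the f(t) helper of the L*a*b* conversion reduces to a cube root for large t.

```python
import numpy as np

# XYZ-to-RGB matrix from the equations above
M = np.array([[ 3.240479, -1.537150, -0.498535],
              [-0.969256,  1.875992,  0.041556],
              [ 0.055648, -0.204043,  1.057311]])

def xyz_to_rgb(xyz):
    # Results outside [0, 1] indicate colors outside the RGB gamut.
    return M @ np.asarray(xyz, dtype=float)

def f(t):
    """Helper used by the CIE L*a*b* conversion."""
    return t ** (1.0 / 3.0) if t > 0.008856 else 7.787 * t + 16.0 / 116.0

# Assumed D65 white point; maps to approximately (1, 1, 1) in RGB.
white = xyz_to_rgb([0.950456, 1.0, 1.088754])
```

The two branches of f(t) agree to within rounding at t = 0.008856, which keeps the L*a*b* conversion continuous across the switch point.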
8. International unit of length used for sea and air navigation equal to 6 076 115 feet or 1 852 meters See also mile A pixel whose value affects the value of a nearby pixel when an image is processed The neighbors of a pixel are usually defined by a kernel or a structuring element Operations on a point in an image that take into consideration the values of the pixels neighboring that point Driver software for National Instruments IMAQ hardware Replaces each pixel value with a nonlinear function of its surrounding pixels A highpass edge extraction filter that favors vertical edges IMAQ Vision Concepts Manual Glossary nonlinear Prewitt filter nonlinear Sobel filter Nth order filter number of planes in an image 0 offset opening operators optical character verification optical representation outer gradient P palette particle IMAQ Vision Concepts Manual A highpass edge extraction filter based on two dimensional gradient information A highpass edge extraction filter based on two dimensional gradient information The filter has a smoothing effect that reduces noise enhancements caused by gradient operators Filters an image using a nonlinear filter This filter orders or classifies the pixel values surrounding the pixel being processed The pixel being processed is set to the Nth pixel value where N is the order of the filter The number of arrays of pixels that compose the imag
9. Edge detection finds edges along a line of pixels in the image. Use the edge detection tools to identify and locate discontinuities in the pixel intensities of an image. The discontinuities are typically associated with abrupt changes in pixel intensity values that characterize the boundaries of objects in a scene.

To use the edge detection tools in IMAQ Vision, first specify a search region in the image. You can specify the search path interactively or programmatically. When specified interactively, you can use one of the line ROI tools to select the search path you want to analyze. You also can programmatically fix the search regions based either on constant values or the result of a previous processing step. For example, you may want to locate edges along a specific portion of a part that has been previously located using particle analysis or pattern matching algorithms. The edge detection software analyzes the pixels along this region to detect edges. You can configure the edge detection tool to find all edges, find the first edge, or find the first and last edges in the region.

Edge detection is an effective tool for many machine vision applications. Edge detection provides your application with information about the location of the boundaries of objects and the presence of discontinuities. Use edge detection in the following three application areas: gauging, detection, and alignment.
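The idea of detecting edges along a line of pixels can be sketched as a 1-D scan: report an edge wherever the intensity difference between neighboring pixels along the search path exceeds a threshold. The function name and threshold below are illustrative assumptions, not the IMAQ Vision API.

```python
import numpy as np

def find_edges(profile, threshold=50):
    """Return indices along a pixel line where the intensity step between
    adjacent pixels is at least `threshold` (illustrative sketch)."""
    diffs = np.diff(np.asarray(profile, dtype=int))
    return [i for i, d in enumerate(diffs) if abs(d) >= threshold]

# A dark-to-bright transition followed by a bright-to-dark transition
# along the search line, as at the two boundaries of a bright object.
line = [10, 12, 11, 200, 205, 198, 20, 22]
edges = find_edges(line)
```

Keeping all detected indices corresponds to "find all edges"; taking `edges[0]` or `(edges[0], edges[-1])` corresponds to the first-edge and first-and-last-edge configurations described above.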
10. F(u,v) = |F(u,v)|·e^(jφ(u,v))

where |F(u,v)| is the magnitude and φ(u,v) is the phase. The magnitude of F(u,v) is also called the Fourier spectrum and is equal to

|F(u,v)| = √(R²(u,v) + I²(u,v))

The Fourier spectrum to the power of two is known as the power spectrum, or spectral density. The phase φ(u,v) is also called the phase angle and is equal to

φ(u,v) = atan(I(u,v)/R(u,v))

By default, when you display a complex image, the magnitude plane of the complex image is displayed using the optical representation. To visualize the magnitude values properly, the magnitude values are scaled by the factor m before they are displayed. The factor m is calculated as

m = 128/(w × h)

where w is the width of the image and h is the height of the image.

Part III: Particle Analysis
This section describes conceptual information about particle analysis, including thresholding, morphology, and particle measurements. Part III, Particle Analysis, contains the following chapters:
• Chapter 8, Thresholding, contains information about thresholding and color thresholding.
• Chapter 9, Binary Morphology, contains information about structuring elements, connectivity, and primary and advanced morphological transformations.
• Chapter 10, Particle Measurements, contains information about characterizing digital particles.

Introduction
You can use particle analysis to detect connected regions or groupings of pixels in an image
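Extracting the magnitude, power spectrum, and phase planes from a complex FFT image follows the equations above directly. This is an illustrative NumPy sketch, not IMAQ Vision code.

```python
import numpy as np

# A small test image: a bright 4x4 square on a dark background.
image = np.zeros((8, 8))
image[2:6, 2:6] = 1.0

F = np.fft.fft2(image)            # complex image F(u,v)

magnitude = np.abs(F)             # sqrt(R**2 + I**2): the Fourier spectrum
power = magnitude ** 2            # power spectrum (spectral density)
phase = np.angle(F)               # phase angle, atan(I / R)

# The null (DC) frequency F(0,0) equals the sum of all pixel values.
```

Separating magnitude and phase like this is what makes the frequency-domain filters later in this chapter possible: they scale the magnitude plane while leaving the phase plane untouched.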
11. Transforms the gray-level values of the pixels of an image to occupy the entire range (0 to 255 in an 8-bit image) of the histogram, increasing the contrast of the image.
Finds the photometric negative of an image. The histogram of a reversed image is equal to the original histogram flipped horizontally around the center of the histogram.
In LabVIEW, a histogram that can be wired directly into a graph.
Locates objects in the image similar to the pattern defined in the structuring element.
Fills all holes in objects that are present in a binary image.
Color encoding scheme in Hue, Saturation, and Intensity.
Color encoding scheme using Hue, Saturation, and Luminance information, where each pixel in the image is encoded using 32 bits: 8 bits for hue, 8 bits for saturation, 8 bits for luminance, and 8 unused bits.
Color encoding scheme in Hue, Saturation, and Value.
Represents the dominant color of a pixel. The hue function is a continuous function that covers all the possible colors generated using the R, G, and B primaries. See also RGB.
Input/output. The transfer of data to/from a computer system involving communications channels, operator interface devices, and/or data acquisition and control interfaces.
A two-dimensional light intensity function f(x, y), where x and y denote spatial coordinates and the value f at any point (x, y) is proportional to the brightness at that point.
A user defined regio
12. [Gaussian 3 × 3 kernels]

Table A-12. Gaussian 5 × 5
1  2  4  2  1
2  4  8  4  2
4  8 16  8  4
2  4  8  4  2
1  2  4  2  1

Table A-13. Gaussian 7 × 7

Technical Support and Professional Services
Visit the following sections of the National Instruments Web site at ni.com for technical support and professional services:
• Support. Online technical support resources include the following:
  - Self-Help Resources. For immediate answers and solutions, visit our extensive library of technical support resources available in English, Japanese, and Spanish at ni.com/support. These resources are available for most products at no cost to registered users and include software drivers and updates, a KnowledgeBase, product manuals, step-by-step troubleshooting wizards, conformity documentation, example code, tutorials and application notes, instrument drivers, discussion forums, a measurement glossary, and so on.
  - Assisted Support Options. Contact NI engineers and other measurement and automation professionals by visiting ni.com/support. Our online system helps you define your question and connects you to the experts by phone, discussion forum, or email.
• Training. Visit ni.com/training for self-paced tutorials, vi
13. Note: The segmentation function is time-consuming. Reduce the image to its minimum significant size before selecting this function.

In Figure 9-26, binary particles, which are shown in black, are superimposed on top of the segments, which are shown in gray shades.

Figure 9-26. Segmentation Function

When applied to an image with binary particles, the transformed image turns completely red because it is entirely composed of pixels set to 1.

Comparisons Between Segmentation and Skiz Functions
The segmentation function extracts segments that each contain one particle and represent the area in which this particle can be moved without intercepting another particle, assuming that all particles move at the same speed. The edges of these segments give a representation of the external skeletons of the particles. Unlike the skiz function, segmentation does not involve median distances.

You can obtain segments using successive dilations of particles until they touch each other and cover the entire image. The final image contains as many segments as there were particles in the original image. However, if you consider the inside of closed skiz lines as segments, you may produce more segments than particles originally present in the image. Consider the upper-right region in Figure 9-27. This image sho
14. Skiz is an inverse skeleton: an L-skeleton performed on an inverse image.

L-Skeleton Function
The L-skeleton function indicates the L-shaped structuring element skeleton function. Notice the original image in Figure 9-23a and how the L-skeleton function produces the rectangle pixel frame image in Figure 9-23b.

Figure 9-23. L-Skeleton Function

M-Skeleton Function
The M-skeleton function extracts a skeleton with more dendrites, or branches. Using the same original image from Figure 9-23a, the M-skeleton function produces the image shown in Figure 9-24.

Figure 9-24. M-Skeleton Function

Skiz Function
The skiz (skeleton of influence zones) function behaves like an L-skeleton applied to the background regions instead of the particle regions. It produces median lines that are at an equal distance from the particles. Using the source image from Figure 9-23a, the skiz function produces the following image, which is shown superimposed on the source image.

Figure 9-25. Skiz Function

Segmentation Function
The segmentation function is applied only to labeled images. It partitions an image into segments that are centered on a particle, such that they do not overlap each other or leave empty zones. Empty zones are caused by dilating particles until they touch one another.
15. Decreases brightness and increases contrast in bright regions of an image and decreases contrast in dark regions of an image Fast Fourier Transform A method used to compute the Fourier transform of an image A reference pattern on a part that helps a machine vision application find the part s location and orientation in an image Window or area on the screen on which you place controls and indicators to create the user interface for your program The magnitude information of the Fourier transform of an image Transforms an image from the spatial domain to the frequency domain Counterparts of spatial filters in the frequency domain For images frequency information is in the form of spatial frequency Feet A set of software instructions executed by a single line of code that may have input and or output parameters and returns a value when executed The nonlinear change in the difference between the video signal s brightness level and the voltage level needed to produce that brightness Measurement of an object or distances between objects G 6 ni com Gaussian filter gradient convolution filter gradient filter gray level gray level dilation gray level erosion grayscale image grayscale morphology H h highpass attenuation highpass FFT filter highpass filter highpass frequency filter highpass truncation histogram National Instruments Corporation G 7 Glossary A filter similar to
16. (R, G, B) triple. The diagonal line of the cube from black (0, 0, 0) to white (1, 1, 1) represents all the grayscale values, where the red, green, and blue components are all equal.

Different computer hardware and software combinations use different ranges for the colors. Common combinations are 0 to 255 and 0 to 65,535 for each component. To map color values within these ranges to values in the RGB cube, divide the color values by the maximum value that the range can take.

Figure 1-7. RGB Cube

The RGB color space lies within the perceptual space of humans. In other words, the RGB cube represents fewer colors than we can see. The RGB space simplifies the design of computer monitors, but it is not ideal for all applications. In the RGB color space, the red, green, and blue color components are all necessary to describe a color. Therefore, RGB is not as intuitive as other color spaces. The HSL color space describes color using only the hue component, which makes HSL the best choice for many image processing applications, such as color matching.
17. Truth tables for the logic operators, where a and b are binary pixel values:

NAND          NOR           XOR           XNOR
a\b  0  1     a\b  0  1     a\b  0  1     a\b  0  1
 0   1  1      0   1  0      0   0  1      0   1  0
 1   1  0      1   0  0      1   1  0      1   0  1

NOT
NOT 0 = 1
NOT 1 = 0

Example 1
The following figure shows the source grayscale image used in this example. Regions of interest have been isolated in a binary format, retouched with morphological manipulations, and finally multiplied by 255 to obtain the following image mask.

The source image AND mask image operation restores the original intensity of the object regions in the mask. The source image OR mask image operation restores the original intensity of the background region in the mask.

Example 2
This example demonstrates the use of the OR operation to produce an image containing the union of two binary images. The following image represents the first image, with a background of 0 and objects with a gray-level value of 128. The following figure shows the second image, featuring a background of 0 and objects with gray-level values of 255.

Frequency Domain Analysis
This chapter contains information about converting images into the frequency domain using the
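The masking behavior of Example 1 can be sketched with bitwise operators on small arrays (illustrative values, not the images from the example): because the mask holds only 0 and 255, AND keeps the source intensity where the mask is 255, and OR keeps it where the mask is 0.

```python
import numpy as np

source = np.array([[ 10,  20],
                   [200, 250]], dtype=np.uint8)
mask = np.array([[  0, 255],
                 [255,   0]], dtype=np.uint8)

# AND with 255 preserves the source value; AND with 0 clears it.
anded = source & mask
# OR with 0 preserves the source value; OR with 255 saturates it.
ored = source | mask
```

The same element-wise operators implement Example 2: OR-ing two binary images whose backgrounds are 0 yields the union of their objects.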
18. Pattern matching can find patterns that have undergone some transformation because of blurring or noise. Blurring usually occurs because of incorrect focus or depth of field changes. Figure 12-4 illustrates typical blurring and noise conditions under which pattern matching works correctly. Figure 12-4a shows the original template image. Figure 12-4b shows changes on the image caused by blurring. Figure 12-4c shows changes on the image caused by noise.

Figure 12-4. Examples of Blur and Noise

Pattern Matching Techniques
Pattern matching includes traditional techniques, newer techniques, and other techniques such as particle analysis.

Traditional Pattern Matching
Traditional pattern matching techniques include normalized cross-correlation, pyramidal matching, and scale-invariant matching.

Cross-Correlation
Normalized cross-correlation is the most common way to find a template in an image. Because the underlying mechanism for correlation is based on a series of multiplication operations, the correlation process is time-consuming. Technologies such as MMX allow you to do parallel multiplications and reduce overall computation time. You can speed up the matching process by reducing the size of the image and restricting the region in the image where the matching is done. However, the basic normalized cross-correlation operation does not meet th
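Normalized cross-correlation can be sketched as follows: slide the template over the image and, at each position, compute the correlation coefficient between the mean-subtracted template and the mean-subtracted image window. This is a minimal illustrative sketch, not the IMAQ Vision implementation, and it makes plain why the multiply-heavy inner loop is time-consuming.

```python
import numpy as np

def ncc(image, template):
    """Normalized cross-correlation map; scores lie in [-1, 1] and a
    perfect match scores 1.0 (illustrative sketch)."""
    th, tw = template.shape
    t = template - template.mean()
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            win = image[r:r+th, c:c+tw]
            w = win - win.mean()
            denom = np.sqrt((w**2).sum() * (t**2).sum())
            out[r, c] = (w * t).sum() / denom if denom else 0.0
    return out

image = np.array([[0, 0, 0, 0],
                  [0, 9, 1, 0],
                  [0, 1, 9, 0],
                  [0, 0, 0, 0]], dtype=float)
template = np.array([[9, 1],
                     [1, 9]], dtype=float)
scores = ncc(image, template)     # peaks at the true match position (1, 1)
```

The normalization by window and template energy is what makes the score invariant to uniform changes in brightness, which plain correlation is not.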
19. • Resolution. The smallest feature size on your object that the imaging system can distinguish.
• Pixel resolution. The minimum number of pixels needed to represent the object under inspection.
• Field of view. The area of the object under inspection that the camera can acquire.
• Working distance. The distance from the front of the camera lens to the object under inspection.
• Sensor size. The size of a sensor's active area, typically defined by the sensor's horizontal dimension.
• Depth of field. The maximum object depth that remains in focus.

For additional information about the fundamental parameters of an imaging system, refer to the Application Notes sections of the Edmund Industrial Optics Optics and Optical Instruments Catalog, or visit Edmund Industrial Optics at edmundoptics.com.

Acquiring Quality Images
The manner in which you set up your system depends on the type of analysis and processing you need to do. Your imaging system should produce images with high enough quality so that you can extract the information you need from the images. Five factors contribute to overall image quality: resolution, contrast, depth of field, perspective, and distortion.

Resolution
There are two kinds of resolution to consider when setting up your imaging system: pixel resolution and resolution. Pixel resolution refers to the minimum number of pixels you need to represent the object under inspection. You can determine the pixel resolution you nee
20. When to Use
The line profile utility is helpful for examining boundaries between components, quantifying the magnitude of intensity variations, and detecting the presence of repetitive patterns. Figure 4-5 illustrates a typical line profile.

Figure 4-5. Line Profile

The peaks and valleys represent increases and decreases of the light intensity along the line selected in the image. Their width and magnitude are proportional to the size and intensity of their related regions. For example, a bright object with uniform intensity appears in the plot as a plateau. The higher the contrast between an object and its surrounding background, the steeper the slopes of the plateau. Noisy pixels, on the other hand, produce a series of narrow peaks.

Intensity Measurements
When to Use
Intensity measurements measure the grayscale image statistics in an image or regions in an image. You can use intensity measurements to measure the average intensity value in a region of the image to determine, for example, the presence or absence of a part or a defect in a part.

Concepts
Densitometry
IMAQ Vision contains the following densitometry parameters:
• Minimum Gray Value. Minimum intensity value in gray-level units.
• Maximum
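A line profile and the densitometry parameters above can be sketched by sampling intensities along a line ROI (here a horizontal row, for simplicity; the array values are illustrative).

```python
import numpy as np

image = np.array([[10, 10, 200, 210, 205, 10],
                  [10, 12, 198, 207, 202, 11]], dtype=np.uint8)

profile = image[0, :]                # intensities along the selected line

min_gray = int(profile.min())        # Minimum Gray Value
max_gray = int(profile.max())        # Maximum Gray Value
mean_gray = float(profile.mean())    # average intensity along the line

# The run of values 200, 210, 205 is the plateau produced by a bright
# object of roughly uniform intensity, as described above.
```

For an arbitrary (non-axis-aligned) line ROI, the same idea applies after sampling the pixels that the line crosses.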
21. an inverse FFT produces an image in which noise, details, texture, and sharp edges are smoothed. A lowpass frequency filter removes or attenuates spatial frequencies located outside a frequency range centered on the fundamental, or null, frequency.

Lowpass Attenuation
Lowpass attenuation applies a linear attenuation to the full frequency range, increasing from the null frequency f0 to the maximum frequency fmax. This is done by multiplying each frequency by a coefficient C, which is a function of its deviation from the fundamental and maximum frequencies:

C(f) = (fmax − f)/(fmax − f0)

where C(f0) = 1 and C(fmax) = 0.

Lowpass Truncation
Lowpass truncation removes a frequency f if it is higher than the cutoff, or truncation, frequency fc. This is done by multiplying each frequency f by a coefficient C equal to 0 or 1, depending on whether the frequency f is greater than the truncation frequency fc:

If f > fc, then C(f) = 0; else C(f) = 1.

The following series of graphics illustrates the behavior of both types of lowpass filters. They give the 3D view profile of the magnitude of the FFT. This example uses the following original FFT. After lowpass attenuation, the magnitude of the central peak is the same, and variations at the edges have almost disappeared.
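The two coefficient functions can be sketched on a 1-D array of frequencies (the f0, fmax, and fc values below are example numbers, not defaults).

```python
import numpy as np

def lowpass_attenuation(f, f0, fmax):
    """Linear ramp: C(f0) = 1 and C(fmax) = 0."""
    return (fmax - np.asarray(f, dtype=float)) / (fmax - f0)

def lowpass_truncation(f, fc):
    """C(f) = 1 for f <= fc, and C(f) = 0 for f > fc."""
    return np.where(np.asarray(f, dtype=float) > fc, 0.0, 1.0)

freqs = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
att = lowpass_attenuation(freqs, f0=0.0, fmax=100.0)
trunc = lowpass_truncation(freqs, fc=50.0)
```

In the 2-D case, f is each spatial frequency's distance from the null frequency, and the resulting coefficients multiply the magnitude plane of the FFT before the inverse transform.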
22. contrast of some regions of the image by finding the minimum and maximum values of those regions and computing the histogram of those regions. A histogram of this region shows the minimum and maximum intensities of the pixels. Those values are used to stretch the dynamic range of the entire image.

Downshifts
This technique is based on shifts of the pixel values. This method applies a given number of right shifts to the 16-bit pixel value and displays the least significant byte. This technique truncates some of the lowest bits, which are not displayed. This method is very fast, but it reduces the real dynamic range of the sensor to 8-bit capabilities. It requires knowledge of the bit depth of the imaging sensor that has been used. For example, an image acquired with a 12-bit camera should be visualized using 4 right shifts in order to display the 8 most significant bits acquired with the camera. If you are using an IMAQ image acquisition device, this technique is the default used by Measurement & Automation Explorer (MAX).

When to Use
Concepts
At the time a grayscale image is displayed on the screen, IMAQ Vision converts the value of each pixel of the image into red, green, and blue intensities for the corresponding pixel displayed on the screen. This process uses a color table, called a palette, which associates a color to each possible
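The 12-bit example above can be sketched in one line: four right shifts discard the 4 least significant bits and keep the 8 most significant bits of each pixel (illustrative NumPy code, not the MAX implementation).

```python
import numpy as np

# 12-bit pixel values (0..4095) stored in a 16-bit image
pixels_12bit = np.array([0, 256, 2048, 4095], dtype=np.uint16)

# 4 right shifts keep the 8 most significant bits of each 12-bit value.
display_8bit = (pixels_12bit >> 4).astype(np.uint8)
```

Because the shift is a fixed bit operation rather than a per-image min/max computation, this is the fastest mapping method, at the cost of discarding the low-order bits entirely.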
23. grayscale value of an image. IMAQ Vision provides the capability to customize the palette used to display an 8-bit grayscale image. With palettes, you can produce different visual representations of an image without altering the pixel data. Palettes can generate effects such as photonegative displays or color-coded displays. In the latter case, palettes are useful for detailing particular image constituents in which the total number of colors is limited.

Displaying images in different palettes helps emphasize regions with particular intensities, identify smooth or abrupt gray-level variations, and convey details that might be difficult to perceive in a grayscale image. For example, the human eye is much more sensitive to small intensity variations in a bright area than in a dark area. Using a color palette may help you distinguish these slight changes.

A palette is a pre-defined or user-defined array of RGB values. It defines, for each possible gray-level value, a corresponding color value to render the pixel. The gray-level value of a pixel acts as an address that is indexed into the table, returning three values corresponding to a red, green, and blue (RGB) intensity. This set of RGB values defines a palette in which varying amounts of red, green, and blue are mixed to produce a color representation of the value range.

In the case of 8-bit grayscale images, pixels can take 2^8, or 256, values ranging from 0 to 255. Color palettes are composed of 2
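A palette is just a 256-entry RGB lookup table indexed by the gray-level value. The sketch below (illustrative NumPy code, not IMAQ Vision's palette API) builds a photonegative palette and applies it by indexing.

```python
import numpy as np

# 256 gray levels, each mapped to an (R, G, B) triple.
palette = np.zeros((256, 3), dtype=np.uint8)
# Photonegative: gray level g renders as (255-g, 255-g, 255-g).
palette[:, 0] = palette[:, 1] = palette[:, 2] = 255 - np.arange(256)

gray = np.array([[0, 128, 255]], dtype=np.uint8)
rgb = palette[gray]        # each pixel's gray value indexes the table
```

The pixel data in `gray` is never modified; only the displayed colors change, which is why switching palettes is free of image-processing side effects.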
24. Convex Hull Area (Ac): Area of the convex hull of the particle.
Image Area (Ai): Area of the image.

Figure 10-3a shows an image of a calibration grid. The image exhibits nonlinear distortion. Figure 10-3b shows an image of coins taken with the same camera setup as Figure 10-3a. The dashed line around Figure 10-3b defines the image area in pixels. Figure 10-3c illustrates the image of coins after image correction. The dashed line around Figure 10-3c defines the image area in calibrated units.

Figure 10-3. Image Area in Pixels and Calibrated Units

Quantities
Table 10-5 lists the IMAQ Vision particle quantity measurements.

Table 10-5. Quantities
Number of Holes: Number of holes in the particle.
Number of Horiz. Segments (SH): Number of horizontal segments in the particle. Number of Horiz. Segments is always given as a pixel measurement.
Number of Vert. Segments (SV): Number of vertical segments in the particle. Number of Vert. Segments is always given as a pixel measurement.
image, you can perform frequency domain operations on the image. Each pixel in a complex image is encoded as two single-precision floating-point values, which represent the real and imaginary components of the complex pixel. You can extract the following four components from a complex image: the real part, imaginary part, magnitude, and phase.

Image Files

An image file is composed of a header followed by pixel values. Depending on the file format, the header contains image information about the horizontal and vertical resolution, pixel definition, and the original palette. Image files may also store information about calibration, pattern matching templates, and overlays. The following are common image file formats:

• Bitmap (BMP)
• Tagged image file format (TIFF)
• Portable network graphics (PNG), which offers the capability of storing image information about spatial calibration, pattern matching templates, and overlays
• Joint Photographic Experts Group format (JPEG)
• National Instruments internal image file format (AIPD), used for saving floating-point, complex, and HSL images

Standard formats for 8-bit grayscale and RGB color images are BMP, TIFF, PNG, JPEG, and AIPD. Standard formats for 16-bit grayscale and complex images are PNG and AIPD.

Internal Representation of an IMAQ Vision Image

Figure 1
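The four planes that can be extracted from a complex image map directly onto standard complex-array operations; a minimal sketch in NumPy (the array values are illustrative):

```python
import numpy as np

# A complex image stores two floats per pixel (real and imaginary parts).
cimg = np.array([[3 + 4j, 1 + 0j]], dtype=np.complex64)

real = cimg.real          # real part
imag = cimg.imag          # imaginary part
magnitude = np.abs(cimg)  # sqrt(real^2 + imag^2)
phase = np.angle(cimg)    # atan2(imag, real), in radians
```

Each extracted plane is an ordinary single-precision image that can then be displayed or processed on its own.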
of the image. Because the minimum and maximum pixel values in an image are used to determine the full dynamic range of that image, the presence of noisy or defective pixels (for non-Class A sensors) with minimum or maximum values can affect the appearance of the displayed image. IMAQ Vision uses the following technique by default:

z = 255 × (x − y) / (v − y)

where z is the 8-bit pixel value,
x is the 16-bit pixel value,
y is the minimum intensity value, and
v is the maximum intensity value.

90% Dynamic

The intensity corresponding to 5% of the cumulative histogram is mapped to 0; the intensity corresponding to 95% of the cumulative histogram is mapped to 255. Values in the 0 to 5% range are mapped to 0, while values in the 95 to 100% range are mapped to 255. This mapping method is more robust than the full dynamic method and is not sensitive to small aberrations in the image. This method requires the computation of the cumulative histogram, or an estimate of the histogram. Refer to Chapter 4, Image Analysis, for more information on histograms.

Given Percent Range

This method is similar to the 90% Dynamic method, except that the minimum and maximum percentages of the cumulative histogram that the software maps to 8 bits are user defined.

Given Range

This technique is similar to the Full Dynamic method, except that the minimum and maximum values to be mapped to 0 and 255 are user defined. You can use this method to enhance the
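The full-dynamic and percentile-based mappings described above can be sketched directly in NumPy (function names and the exact rounding are ours; the manual does not specify an implementation):

```python
import numpy as np

def full_dynamic(img16):
    # Map [min, max] of the 16-bit image linearly onto [0, 255].
    y, v = int(img16.min()), int(img16.max())
    return ((img16.astype(np.float64) - y) * 255.0 / (v - y)).astype(np.uint8)

def percent_range(img16, lo_pct=5.0, hi_pct=95.0):
    # "90% Dynamic"-style mapping: clip at cumulative-histogram percentiles
    # before scaling, so a few extreme pixels cannot skew the display.
    lo, hi = np.percentile(img16, [lo_pct, hi_pct])
    clipped = np.clip(img16.astype(np.float64), lo, hi)
    return ((clipped - lo) * 255.0 / (hi - lo)).astype(np.uint8)

img = np.array([0, 32768, 65535], dtype=np.uint16)
out = full_dynamic(img)
ramp = percent_range(np.arange(1000, dtype=np.uint16))
```

With user-defined percentages or a user-defined [lo, hi] range, the same `percent_range` shape covers the "Given Percent Range" and "Given Range" variants.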
.......................................................... 9-3
Pixel Frame Shape ........................................ 9-4
Connectivity ............................................. 9-7
    When to Use .......................................... 9-7
    Connectivity Concepts ................................ 9-7
    In Depth Discussion .................................. 9-8
        Connectivity-4 ................................... 9-8
        Connectivity-8 ................................... 9-9
Primary Morphology Operations ............................ 9-9
    When to Use .......................................... 9-9
    Primary Morphology Concepts .......................... 9-10
        Erosion and Dilation Functions ................... 9-10
        Opening and Closing Functions .................... 9-13
        Inner Gradient Function .......................... 9-14
        Outer Gradient Function .......................... 9-14
        Hit-Miss Function ................................ 9-14
        Thinning Function ................................ 9-17
        Thickening Function .............................. 9-18
        Proper Opening Function .......................... 9-20
        Proper Closing Function .......................... 9-20
        Auto Median Function ............................. 9-21
Advanced Morphology Operations ........................... 9-21
    When to Use .......................................... 9-21
    Advanced Morphology Transforms Concepts .............. 9-22
        Border Function .................................. 9-22
        Hole Filling Function
the more specific the template and the more selective the effect.

Table 9-2. How the Structuring Element Affects Erosion

Structuring Element: [3 × 3 element with 1s in the three upper-left neighbor positions]
After Erosion: A pixel is cleared if it is equal to 1 and if its three upper-left neighbors do not equal 1. The erosion truncates the upper-left particle borders.

Structuring Element: [3 × 3 element with 1s in the lower and right neighbor positions]
After Erosion: A pixel is cleared if it is equal to 1 and if its lower and right neighbors do not equal 1. The erosion truncates the bottom and right particle borders but retains the corners.

Table 9-3. How the Structuring Element Affects Dilation

Structuring Element: [3 × 3 element with 1s in the three upper-left neighbor positions]
After Dilation: A pixel is set to 1 if it is equal to 1 or if one of its three upper-left neighbors equals 1. The dilation expands the lower-right particle borders.

Structuring Element: [3 × 3 element with 1s in the lower and right neighbor positions]
After Dilation: A pixel is set to 1 if it is equal to 1 or if its lower or right neighbor equals 1. The dilation expands the upper and left particle borders.

Opening and Closing Functions

The opening function is an erosion followed by a dilation. This function removes small particles and smooths boundaries. This operation does not significantly alter the area and shape of particles because erosion and dilation are dual transformations, in which borders removed by the erosion process are restored during dilation. However, small particles eliminated during the erosion are not restored by the dilation.
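Erosion and dilation with a user-chosen structuring element can be sketched with SciPy's binary morphology routines (the cross-shaped element and the 5 × 5 test image are illustrative, not from the manual):

```python
import numpy as np
from scipy import ndimage

# A 3x3 cross-shaped structuring element: a pixel survives erosion only if
# every 1 in the element lands on a 1 in the image.
cross = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [0, 1, 0]], dtype=bool)

img = np.zeros((5, 5), dtype=bool)
img[1:4, 1:4] = True                 # a 3x3 square particle

eroded = ndimage.binary_erosion(img, structure=cross)
dilated = ndimage.binary_dilation(img, structure=cross)
```

Eroding the square with the cross leaves only its center pixel, while dilating it grows the borders by one pixel in the four cross directions, matching the border-truncating and border-expanding behavior described in Tables 9-2 and 9-3.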
.......................................................... 7-10
In Depth Discussion ...................................... 7-11
    Fourier Transform .................................... 7-11
    FFT Display .......................................... 7-12

Part III  Particle Analysis
    Introduction ......................................... III-1
    When to Use .......................................... III-2
    Particle Analysis Concepts ........................... III-2

Chapter 8  Thresholding
    Introduction ......................................... 8-1
    When to Use .......................................... 8-1
    Thresholding Concepts ................................ 8-2
        Intensity Threshold .............................. 8-2
        Thresholding Examples ............................ 8-2
        Automatic Threshold .............................. 8-3
    In Depth Discussion .................................. 8-6
        Automatic Thresholding Techniques ................ 8-6
            Clustering ................................... 8-7
            Entropy ...................................... 8-7
            Interclass Variance .......................... 8-8
            Metric ....................................... 8-8
            Moments ...................................... 8-9
    Color Thresholding ................................... 8-9
        When to Use ...................................... 8-9

Chapter 9  Binary Morphology
    Introduction ......................................... 9-1
    Structuring Elements ................................. 9-1
        When to Use ...................................... 9-1
        Structuring Elements Concepts .................... 9-2
            Structuring Element Size ..................... 9-2
            Structuring Element Values
[Figure 1-3: pixel-value grids illustrating different ways of filling a one-pixel image border; Figure 1-3b shows a border filled with zeros.]

Figure 1-3. Setting the Pixel Values of an Image Border

The method you use to fill the border pixels depends on the processing function you require for your application. Review how the function works before choosing a border-filling method, because your choice can drastically affect the processing results. For example, if you are using a function that detects edges in an image based on the difference between a pixel and its neighbors, do not set the border pixel values to zero. As shown in Figure 1-3b, an image border containing zero values introduces artificial edges along the border.
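Common border-filling choices map directly onto NumPy's `np.pad` modes; a minimal sketch comparing zero-filling with copying and mirroring the outermost pixels (the 2 × 2 image is illustrative):

```python
import numpy as np

img = np.array([[1, 2],
                [3, 4]])

# Three ways to fill a 1-pixel border before neighborhood processing:
zeros  = np.pad(img, 1, mode="constant", constant_values=0)  # fill with 0
copied = np.pad(img, 1, mode="edge")                         # copy outer pixels
mirror = np.pad(img, 1, mode="reflect")                      # mirror outer pixels
```

With `mode="constant"` the border introduces an artificial step down to zero, which is exactly why the text warns against zero-filling before edge detection; `edge` and `reflect` avoid that discontinuity.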
.......................................................... 5-3
Logarithmic and Inverse Gamma Correction ................. 5-4
Exponential and Gamma Correction ......................... 5-6
Equalize ................................................. 5-8
Convolution Kernels ...................................... 5-10
    Concepts ............................................. 5-10
Spatial Filtering
    When to Use
    Spatial Filtering Concepts
        Spatial Filter Classification Summary
        Linear Filters
        Nonlinear Filters
    In Depth Discussion
        Linear Filters
        Nonlinear Prewitt Filter
        Nonlinear Sobel Filter
        Nonlinear Gradient Filter
        Roberts Filter
        Differentiation Filter
        Sigma Filter
        Lowpass Filter
        Median Filter
        Nth Order Filter
Grayscale Morphology
    When to Use
    Grayscale Morphology Concepts
        Erosion Function
        Dilation Function
        Erosion and Dilation Examples
        Opening Function
        Closing Function
.......................................................... 9-22
Labeling Functions ....................................... 9-23
Lowpass and Highpass Filters ............................. 9-23
Separation Function ...................................... 9-24
Skeleton Functions ....................................... 9-25
Segmentation Function .................................... 9-26
Distance Function ........................................ 9-28
Danielsson Function ...................................... 9-28
Circle Function .......................................... 9-29
Convex Hull Function ..................................... 9-30

Chapter 10  Particle Measurements
    Introduction ......................................... 10-1
    When to Use .......................................... 10-1
        Pixel versus Real-World Measurements ............. 10-1
        Particle Measurements ............................ 10-2
    Particle Concepts .................................... 10-3
        Particle Holes ................................... 10-5
    Particle Measurement Definitions ..................... 10-6
        Coordinates ...................................... 10-7
        Lengths .......................................... 10-9
        Areas ............................................ 10-13
        Quantities ....................................... 10-14
        Angles ........................................... 10-14
        Ratios ........................................... 10-16
        ................................................. 10-16
        Sums ............................................. 10-17
        Moments .......................................... 10-18

Part IV  Machine Vision

Chapter 11  Edge Detection
    Introduction ......................................... 11-1
    When to Use .......................................... 11-1
        Gauging .......................................... 11-2
        Detection ........................................ 11-2
        Alignment ........................................ 11-3
    Edge Detection Concepts .............................. 11-4
        Definition of an Edge
    CIE L*a*b* color space 1-19
    CMY color space 1-19
    color sensations 1-14
    common types of color spaces 1-13
    definition 1-13
    generating color spectrum 14-1
    HSL color space 1-17
    RGB color space 1-15
    transformations
        RGB and CIE L*a*b* 1-24
        RGB and CIE XYZ 1-21
        RGB and CMY 1-25
        RGB and HSL 1-20
        RGB and YIQ 1-25
        RGB to grayscale 1-20
    when to use 1-13
    YIQ color space 1-19
color spectrum
    generating 14-3
    HSL color space 14-1
    overview 14-1
color thresholding
    ranges
        HSL image (figure) 8-11
        RGB image (figure) 8-10
    when to use 8-9
comparison operators. See logic and comparison operators.
complex images
    definition 1-5
Concentric Rake function 11-12
connectivity
    basic concepts and examples 9-7
    connectivity-4 9-8
    connectivity-8 9-9
    in-depth discussion 9-8
    when to use 9-7
contacting National Instruments B-1
contours
    extracting and highlighting 5-21
    thickness 5-23
contrast
    histogram for determining lack of contrast 4-2
    setting 3-5
conventions used in the manual xv
convex hull function (binary morphology) 9-30
convex hull (digital particles) 10-4
convolution
    definition 5-13
    types of families 5-13
convolution kernels. See also linear filters.
    basic concepts 5-10
    examples of kernels (figure) 5-11
    filtering border pixels (figure) 5-13
    mechanics of filtering (figure) 5-11
    size of 5-13
    when to use 5-10
coordinate system d
Gauging

Use gauging to make critical dimensional measurements, such as lengths, distances, diameters, angles, and counts, to determine if the product under inspection is manufactured correctly. The component or part is either classified or rejected, depending on whether the gauged parameters fall inside or outside of the user-defined tolerance limits.

Gauging is often used both inline and offline in production. During inline processes, each component is inspected as it is manufactured. Visual inline gauging inspection is a widely used inspection technique in applications such as mechanical assembly verification, electronic packaging inspection, container inspection, glass vial inspection, and electronic connector inspection.

Gauging applications also often measure the quality of products offline. A sample of products is extracted from the production line. Next, measured distances between features on the object are studied to determine if the sample falls within a tolerance range.

You can measure the distances separating the different edges located in an image, as well as positions measured using particle analysis or pattern matching techniques. Edges can also be combined to derive best-fit lines, projections, intersections, and angles. You also can use edge locations to compute estimations of shape measurements such as circles, ellipses, polygons, and so on.
Equivalent Ellipse Minor Axis: Length of the minor axis of the ellipse with the same area and perimeter as the particle. Symbol: E_b.

Equivalent Ellipse Minor Axis (Feret): Length of the minor axis of the ellipse with the same area as the particle and major axis equal in length to the Max Feret Diameter. Symbol: E_bF. Equation: E_bF = 4A / (πF)

Table 10-3. Lengths (Continued)

Measurement                       Definition                                      Symbol   Equation
Equivalent Rect Long Side         Longest side of the Equivalent Rect             R_a      (P + sqrt(P² - 16A)) / 4
Equivalent Rect Short Side        Shortest side of the Equivalent Rect            R_b      (P - sqrt(P² - 16A)) / 4
Equivalent Rect Diagonal          Distance between opposite corners of the        R_d      sqrt(R_a² + R_b²)
                                  Equivalent Rect
Equivalent Rect Short Side        Shortest side of the rectangle with the same    R_bF     A / F
(Feret)                           area as the particle and longest side equal
                                  in length to the Max Feret Diameter
Average Horiz. Segment Length     Average length of a horizontal segment in the
                                  particle. Sums only the horizontal segments
                                  that do not superimpose any other horizontal
                                  segment. Always given as a pixel measurement.
Average Vert. Segment Length      Average length of a vertical segment in the
                                  particle. Sums only the vertical segments
                                  that do not superimpose any other vertical
                                  segment. Always given as a pixel measurement.
Hydraulic Radius                  The particle area divided by the particle                A / P
                                  perimeter
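The Equivalent Rect formulas can be checked numerically; this sketch (plain NumPy, function name ours) derives both sides and the diagonal from a particle's perimeter P and area A:

```python
import numpy as np

def equivalent_rect(P, A):
    # Rectangle with the same perimeter P and area A as the particle:
    # its sides are the two roots of s^2 - (P/2)s + A = 0.
    root = np.sqrt(P * P - 16.0 * A)
    long_side = (P + root) / 4.0
    short_side = (P - root) / 4.0
    diagonal = np.hypot(long_side, short_side)
    return long_side, short_side, diagonal

# A 3 x 5 rectangle is its own equivalent rect: P = 16, A = 15.
L, S, D = equivalent_rect(16.0, 15.0)
```

For the 3 × 5 test rectangle the formulas return sides 5 and 3, confirming that a rectangle maps to itself.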
Fast Fourier transform, and information about analyzing and processing images in the frequency domain.

Introduction

Frequency filters alter pixel values with respect to the periodicity and spatial distribution of the variations in light intensity in the image. Unlike spatial filters, frequency filters do not apply directly to a spatial image but to its frequency representation. The frequency representation of an image is obtained through the Fast Fourier transform (FFT) function, which reveals information about the periodicity and dispersion of the patterns found in the source image. You can filter the spatial frequencies seen in an FFT image. The inverse FFT function then restores a spatial representation of the filtered FFT image:

f(x, y) → FFT → F(u, v) → Filter → H(u, v)F(u, v) → Inverse FFT → g(x, y)

Frequency processing is another technique for extracting information from an image. Instead of using the location and direction of light-intensity variations, you can use frequency processing to manipulate the frequency of the occurrence of these variations in the spatial domain. This new component is called the spatial frequency, which is the frequency with which the light intensity in an image varies as a function of spatial coordinates. Spatial frequencies of an image are computed with the FFT. The FFT is calculated in two steps: a 1D Fast Fourier transform of the rows, followed by a 1D Fast Fourier transform of the columns.
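The FFT → filter → inverse-FFT chain above can be sketched with NumPy's FFT routines. The ideal circular lowpass H(u, v) and the cutoff value are illustrative choices, not the manual's filter definitions:

```python
import numpy as np

def fft_lowpass(img, cutoff):
    # f(x, y) -> F(u, v), with the DC component shifted to the center.
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    u = np.arange(rows) - rows // 2
    v = np.arange(cols) - cols // 2
    dist = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    H = (dist <= cutoff).astype(float)          # ideal lowpass H(u, v)
    # H(u, v) F(u, v) -> inverse FFT -> g(x, y)
    g = np.fft.ifft2(np.fft.ifftshift(F * H))
    return g.real

img = np.random.default_rng(0).random((32, 32))
smooth = fft_lowpass(img, cutoff=4)
```

Zeroing the high spatial frequencies attenuates rapid intensity variations, so the output keeps the image mean (the DC term) while reducing its variance.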
Figure 11-1 shows how a gauging application uses edge detection to measure the length of the gap in a spark plug.

Figure 11-1. Gauging Application Using Edge Detection

Detection

Part present/not present applications are typical in electronic connector assembly and mechanical assembly applications. The objective of the application is to determine if a part is present or not present using line profiles and edge detection. An edge along the line profile is defined by the level of contrast between background and foreground and the slope of the transition. Using this technique, you can count the number of edges along the line profile and compare the result to an expected number of edges. This method offers a less numerically intensive alternative to other image processing methods, such as image correlation and pattern matching.

Figure 11-2 shows a simple detection application in which the number of edges detected along the search line profile determines if a connector has been assembled properly. Detection of eight edges indicates that there are four wires. Any other edge count means that the part has not been assembled correctly.

Figure 11-2. Connector Inspection Using Edge Detection

You also can use edge detection to detect structural defects, such as cracks, or cosmetic defects, such as scratches on a part. If the part is of uniform intensity, these defects show up as sharp changes in the intensity profile.
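The present/not-present check can be sketched as counting threshold-crossing transitions along a 1-D line profile (the profile values and the helper name are illustrative, not taken from the connector example):

```python
import numpy as np

def count_edges(profile, threshold):
    # An edge is a step between adjacent samples whose contrast
    # (absolute difference) meets or exceeds the threshold.
    diffs = np.diff(profile.astype(int))
    return int(np.count_nonzero(np.abs(diffs) >= threshold))

# Four bright "wires" on a dark background produce eight edges
# (one rising and one falling edge per wire).
profile = np.array([0, 0, 200, 200, 0, 0, 200, 0, 0, 200, 200, 0, 200, 0])
n_edges = count_edges(profile, threshold=100)
```

Comparing `n_edges` against the expected count (eight, in the connector example) gives the pass/fail decision without any 2-D correlation or pattern matching.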
Functions

Using pattern matching techniques to locate a reference feature is a good alternative to edge detection when you cannot find straight, distinct edges in the image. The reference feature, or template, is the basis for the coordinate system. The software searches for a template image in a rectangular search area of the reference image. The location and orientation of the located template is used to create the reference position of a coordinate system or to update the current location and orientation of an existing coordinate system. The same constraints on feature stability and robustness that apply to the edge detection techniques also apply to pattern matching.

Pattern matching uses one of two strategies: shift-invariant pattern matching and rotation-invariant pattern matching. Shift-invariant pattern matching locates a template in an ROI or the entire image with a maximum tolerance in rotation of 5°. The rotation-invariant strategy locates a template in the image even when the template varies in orientation between 0° and 360°. For recommendations about the type of patterns to use for a template, refer to Chapter 12, Pattern Matching.

Figure 13-4 illustrates how to locate a coordinate system using a shift-invariant pattern matching strategy. Figure 13-4a shows a reference image with a defined reference coordinate system.
Auto Median Function

The auto median function uses dual combinations of openings and closings. It generates simpler particles that contain fewer details. If I is the source image, the auto median function extracts the intersection between the proper opening and the proper closing of the source image I:

auto-median(I) = AND(OCO(I), COC(I))

or

auto-median(I) = AND(DEEDDE(I), EDDEED(I))

where I is the source image,
E is an erosion,
D is a dilation,
O is an opening,
C is a closing,
F(I) is the image obtained after applying the function F to the image I, and
GF(I) is the image obtained after applying the function F to the image I, followed by the function G.

Advanced Morphology Operations

The advanced morphology operations are built upon the primary morphological operators and work on particles as opposed to pixels. Each of the operations has been developed to perform a specific operation on the particles in a binary image.

When to Use

Use the advanced morphological operations to fill holes in particles, remove particles that touch the border of the image, remove unwanted small and large particles, separate touching particles, find the convex hull of particles, and more.

You can use these transformations to prepare particles for quantitative analysis, observe the geometry of regions, extract the simple
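The auto median defined above, the AND of the open-close-open and close-open-close chains, can be sketched with SciPy's binary morphology routines (the 3 × 3 structuring element and the test image are illustrative choices):

```python
import numpy as np
from scipy import ndimage

def oco(img, s):
    # OCO(I): opening, then closing, then opening.
    return ndimage.binary_opening(
        ndimage.binary_closing(ndimage.binary_opening(img, s), s), s)

def coc(img, s):
    # COC(I): closing, then opening, then closing.
    return ndimage.binary_closing(
        ndimage.binary_opening(ndimage.binary_closing(img, s), s), s)

def auto_median(img, s):
    # auto-median(I) = AND(OCO(I), COC(I))
    return oco(img, s) & coc(img, s)

s = np.ones((3, 3), dtype=bool)
img = np.zeros((5, 5), dtype=bool)
img[1:4, 1:4] = True                 # a clean 3x3 square particle
result = auto_median(img, s)
```

A particle that is already "simple", such as the solid square here, passes through unchanged, while small protrusions and indentations narrower than the element are removed.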
thinning(I) = I XOR hit-miss(I)

Figure 9-18a shows the binary source image used in the following example of thinning. Figure 9-18b illustrates the resulting image, in which isolated pixels surrounded by background are removed from the image. This example uses the following structuring element:

0 0 0
0 1 0
0 0 0

Figure 9-18. Thinning Function

Another thinning example uses the source image shown in Figure 9-19a. Figures 9-19b through 9-19d show the results of three thinnings applied to the source image. Each thinning uses a different structuring element, which is specified above each transformed image. Gray cells indicate pixels equal to 1.

Figure 9-19. Thinning Function with Structuring Elements

Thickening Function

The thickening function adds to an image those pixels located in a neighborhood that matches a template specified by the structuring element. Depending on the configuration of the structuring element, you can use thickening to fill holes and smooth right angles along the edges of particles. A larger structuring element allows for a more specific template.

The thickening function extracts the union between a source image and its transformed image, which was created by a hit-miss function using a structuring element specified for thickening. In binary terms, the operation adds the result of a hit-miss transformation to a source image.
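The relation thinning(I) = I XOR hit-miss(I) can be sketched with SciPy's hit-or-miss transform. Here the template (foreground "hits" and background "misses" passed separately, which is how SciPy expresses the element) matches isolated pixels, so thinning deletes them, as in the Figure 9-18 example:

```python
import numpy as np
from scipy import ndimage

img = np.zeros((5, 5), dtype=bool)
img[2, 2] = True                  # an isolated pixel
img[0, 0] = img[0, 1] = True      # a two-pixel particle, not isolated

# Template: center must be 1 (structure1), all 8 neighbors must be 0
# (structure2) -- it matches only isolated foreground pixels.
matches = ndimage.binary_hit_or_miss(
    img,
    structure1=np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]]),
    structure2=np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]]))

thinned = img ^ matches           # thinning(I) = I XOR hit-miss(I)
```

The isolated pixel is removed while the two-pixel particle survives, because the hit-miss template only matched where every neighbor was background.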
N + 1 pixels. These particles are divided into two parts after (N + 1)/2 erosions. The above definition is true when N is an odd number, but it must be modified slightly when N is an even number because of the use of erosions in determining whether a narrowing should be broken or kept. The function cannot discriminate a narrowing with a width of 2k pixels from a narrowing with a width of 2k − 1 pixels; therefore, one erosion breaks both a narrowing that is 2 pixels wide and a narrowing that is 1 pixel wide.

The precision of the separation is limited to the elimination of constrictions that have a width smaller than an even number of pixels:

• If N is an even number 2k, the separation breaks a narrowing with a width smaller than or equal to 2k − 2 pixels. It uses k − 1 erosions.
• If N is an odd number 2k + 1, the separation breaks a narrowing with a width smaller than or equal to 2k pixels. It uses k erosions.

Skeleton Functions

A skeleton function applies a succession of thinnings until the width of each particle becomes equal to 1 pixel. The skeleton functions are both time- and memory-intensive. They are based on conditional applications of thinnings and openings that use various configurations of structuring elements.

L-Skeleton uses the following type of structuring element:

[3 × 3 L-type structuring element]

M-Skeleton uses the following type of structuring element:

[3 × 3 M-type structuring element]
Opening and Closing Examples
Proper Opening Function
Proper Closing Function
Auto Median Function
In Depth Discussion
    Erosion Concept and Mathematics
    Dilation Concept and Mathematics
    Proper Opening Concept and Mathematics
    Proper Closing Concept and Mathematics
    Auto Median Concept and Mathematics

Chapter 6  Operators
    Introduction
    When to Use
    Operator Concepts
        Arithmetic Operators
        Logic and Comparison Operators
        Example .......................................... 6-5
        Example .......................................... 6-6

Chapter 7  Frequency Domain Analysis
    Introduction ......................................... 7-1
    When to Use .......................................... 7-2
    Fast Fourier Transform Concepts ...................... 7-3
        FFT Representation ............................... 7-3
        Lowpass FFT Filters .............................. 7-6
        Highpass FFT Filters ............................. 7-8
        Mask FFT Filters
Max: max(p_a, p_b)
Min: min(p_a, p_b)

In the case of images with 8-bit resolution, logic operators are mainly designed to do the following:

• Combine gray-level images with binary mask images, which are composed of pixels equal to 0 or 255
• Combine or compare images with binary or labeled contents

Table 6-3 illustrates how logic operators can be used to extract or remove information in an image.

Table 6-3. Using Logic Operators with Binary Image Masks

For a given p_a     If p_b = 255, then        If p_b = 0, then
AND                 p_a AND 255 = p_a         p_a AND 0 = 0
NAND                p_a NAND 255 = NOT p_a    p_a NAND 0 = 255
OR                  p_a OR 255 = 255          p_a OR 0 = p_a
NOR                 p_a NOR 255 = 0           p_a NOR 0 = NOT p_a
XOR                 p_a XOR 255 = NOT p_a     p_a XOR 0 = p_a
Logic Difference    p_a AND NOT 255 = 0       p_a AND NOT 0 = p_a

Truth Tables

The following truth tables describe the rules used by the logic operators. The top row and left column give the values of the input bits. The cells in the table give the output value for a given pair of input bits.

AND     b = 0   b = 1
a = 0     0       0
a = 1     0       1

OR      b = 0   b = 1
a = 0     0       1
a = 1     1       1
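The mask behavior in Table 6-3 falls out of ordinary bitwise operations on 8-bit arrays; a minimal sketch (values are illustrative):

```python
import numpy as np

# A binary mask image holds only 0 or 255. ANDing passes pixels through
# where the mask is 255 and clears them where it is 0; ORing forces
# pixels to 255 where the mask is 255 and passes them through elsewhere.
img  = np.array([ 12, 200,  77,   3], dtype=np.uint8)
mask = np.array([255,   0, 255,   0], dtype=np.uint8)

kept    = img & mask   # p AND 255 = p,   p AND 0 = 0
cleared = img | mask   # p OR 255 = 255,  p OR 0 = p
```

This is why the text singles out 0/255 masks: with those two values, the bitwise operators act as per-pixel keep/clear and keep/set switches on gray-level data.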
ROIs, refer to the Regions of Interest section of Chapter 2, Display.

Color Spaces

When to Use

Color spaces allow you to represent a color. A color space is a subspace within a three-dimensional coordinate system where each color is represented by a point. You can use color spaces to facilitate the description of colors between persons, machines, or software programs.

Various industries and applications use a number of different color spaces. Humans perceive color according to parameters such as brightness, hue, and intensity, while computers perceive color as a combination of red, green, and blue. The printing industry uses cyan, magenta, and yellow to specify color. The following is a list of common color spaces:

• RGB: Based on red, green, and blue. Used by computers to display images.
• HSL: Based on hue, saturation, and luminance. Used in image processing applications.
• CIE: Based on brightness, hue, and colorfulness. Defined by the Commission Internationale de l'Éclairage (International Commission on Illumination) as the different sensations of color that the human brain perceives.
• CMY: Based on cyan, magenta, and yellow. Used by the printing industry.
• YIQ: Separates the luminance information (Y) from the color information (I and Q). Used for TV broadcasting.

You must define a color space every time you process color images. With IMAQ Vision
Size f × f. Example of a filter of size 3 × 3:

• If N = (f² − 1)/2 (order 4 for a 3 × 3 filter), each pixel is replaced by its local median value. Dark pixels isolated in objects are removed, as are bright pixels isolated in the background. The overall area of the background and object regions does not change. This is equivalent to a median filter.
• If N > (f² − 1)/2 (for example, order 8 for a 3 × 3 filter), the Nth order filter has the tendency to dilate bright regions and erode dark regions, smoothing the image and dilating bright objects.
• If N = f² − 1, each pixel is replaced by its local maximum.

In Depth Discussion

If P(i, j) represents the intensity of the pixel P with the coordinates (i, j), the pixels surrounding P(i, j) can be indexed as follows in the case of a 3 × 3 matrix:

P(i−1, j−1)  P(i, j−1)  P(i+1, j−1)
P(i−1, j)    P(i, j)    P(i+1, j)
P(i−1, j+1)  P(i, j+1)  P(i+1, j+1)

A linear filter assigns to P(i, j) a value that is a linear combination of its surrounding values. For example:

P(i, j) = P(i, j−1) + P(i−1, j) + 2P(i, j) + P(i+1, j) + P(i, j+1)

A nonlinear filter assigns to P(i, j) a value that is not a linear combination of the surrounding values. For example:

P(i, j) = max(P(i−1, j−1), P(i+1, j−1), P(i−1, j+1), P(i+1, j+1))

In the case of a 5 × 5 neighborhood, the i and j indexes vary from −2 to 2. The series of pixels that includes P(i, j) and its surrounding pixels is annotated as P(n, m).

Linear Filters
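The Nth order filter described above can be sketched directly: sort the f × f neighborhood of each interior pixel and keep the value at rank N (0 is the local minimum, (f² − 1)/2 the median, f² − 1 the maximum). This plain-Python version ignores border pixels for brevity:

```python
import numpy as np

def nth_order_3x3(img, n):
    # Replace each interior pixel by the nth-smallest value (0 <= n <= 8)
    # of its 3x3 neighborhood; border pixels are left unchanged here.
    rows, cols = img.shape
    out = img.copy()
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            out[i, j] = np.sort(img[i-1:i+2, j-1:j+2], axis=None)[n]
    return out

img = np.arange(9).reshape(3, 3)   # neighborhood values 0..8
```

On this 3 × 3 test image, order 0, 4, and 8 give the neighborhood minimum, median, and maximum for the single interior pixel, matching the three cases in the bullets above.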
The purpose of the nonlinear filters is to either extract the contours (edge detection) or remove the isolated pixels. IMAQ Vision has six different methods you can use for contour extraction: Differentiation, Gradient, Prewitt, Roberts, Sigma, or Sobel. The Canny edge detection filter is a specialized edge detection method that locates edges accurately, even under low signal-to-noise conditions in an image. To harmonize pixel values, choose between two filters, each of which uses a different method: NthOrder and LowPass. These functions require that either a kernel size and order number, or a percentage, is specified on input.

Spatial filters alter pixel values with respect to variations in light intensity in their neighborhood. The neighborhood of a pixel is defined by the size of a matrix, or mask, centered on the pixel itself. These filters can be sensitive to the presence or absence of light-intensity variations. Spatial filters fall into two categories:

• Highpass filters emphasize significant variations of the light intensity, usually found at the boundary of objects. Highpass frequency filters help isolate abruptly varying patterns that correspond to sharp edges, details, and noise.
• Lowpass filters attenuate variations of the light intensity. Lowpass frequency filters help emphasize gradually varying patterns, such as objects and the background. They have the tendency to smooth images by eliminating details and blurring edges.
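The two categories can be illustrated with one representative kernel each; these are common textbook choices (a 3 × 3 averaging kernel and a Laplacian), not NI's exact predefined kernel sets:

```python
import numpy as np
from scipy import ndimage

lowpass = np.ones((3, 3)) / 9.0                  # local average: smooths
highpass = np.array([[ 0, -1,  0],
                     [-1,  4, -1],
                     [ 0, -1,  0]], dtype=float)  # Laplacian: responds to edges

img = np.zeros((5, 5))
img[:, 2:] = 1.0                                  # vertical step edge

smooth = ndimage.convolve(img, lowpass)           # edge is blurred
edges = ndimage.convolve(img, highpass)           # nonzero only at the edge
```

In flat regions the Laplacian output is zero, and it responds only where the intensity changes abruptly, while the averaging kernel spreads the step across neighboring columns.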
Yn, and Zn are the tri-stimulus values of the reference white, and L* represents the lightness. The hue and chroma can be calculated as follows:

Hue = tan⁻¹(b*/a*)
Chroma = sqrt(a*² + b*²)

Because the color space is now uniform, a color difference formula can be given as the Euclidean distance between the coordinates of two colors in CIE L*a*b*:

ΔE = sqrt((ΔL*)² + (Δa*)² + (Δb*)²)

To transform CIE L*a*b* values to RGB, first convert the CIE L*a*b* values to CIE XYZ using the following equations:

X = Xn (P + a*/500)³
Y = Yn P³
Z = Zn (P − b*/200)³

where P = (L* + 16)/116

Then use the conversion matrix given in the RGB and CIE XYZ section to convert CIE XYZ to RGB.

RGB and CMY

The following matrix operation converts the RGB color space to the CMY color space:

[C]   [1]   [R]
[M] = [1] − [G]
[Y]   [1]   [B]

Normalize all color values to lie between 0 and 1 before using this conversion equation. To obtain RGB values from a set of CMY values, subtract the individual CMY values from 1.

RGB and YIQ

The following matrix operation converts the RGB color space to the YIQ color space:

[Y]   [0.299   0.587   0.114] [R]
[I] = [0.596  −0.275  −0.321] [G]
[Q]   [0.212  −0.523   0.311] [B]

The following matrix operation converts the YIQ color space to the RGB color space:

[R]   [1.0   0.956   0.621] [Y]
[G] = [1.0  −0.272  −0.647] [I]
[B]   [1.0  −1.105   1.702] [Q]
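The RGB-to-YIQ and YIQ-to-RGB matrices quoted above can be checked as a round trip; because the published entries are rounded to three decimals, the recovered pixel matches the original only to within that rounding:

```python
import numpy as np

rgb_to_yiq = np.array([[0.299,  0.587,  0.114],
                       [0.596, -0.275, -0.321],
                       [0.212, -0.523,  0.311]])

yiq_to_rgb = np.array([[1.0,  0.956,  0.621],
                       [1.0, -0.272, -0.647],
                       [1.0, -1.105,  1.702]])

rgb = np.array([0.2, 0.5, 0.8])   # normalized color values in [0, 1]
yiq = rgb_to_yiq @ rgb            # forward conversion
back = yiq_to_rgb @ yiq           # inverse conversion
```

The luminance component Y is the first entry of `yiq`, which is why YIQ is convenient when only the brightness information needs processing.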
Hu Moment 5    H5 = (Nxxx − 3Nxyy)(Nxxx + Nxyy)[(Nxxx + Nxyy)² − 3(Nxxy + Nyyy)²]
                    + (3Nxxy − Nyyy)(Nxxy + Nyyy)[3(Nxxx + Nxyy)² − (Nxxy + Nyyy)²]

Hu Moment 6    H6 = (Nxx − Nyy)[(Nxxx + Nxyy)² − (Nxxy + Nyyy)²]
                    + 4Nxy(Nxxx + Nxyy)(Nxxy + Nyyy)

Hu Moment 7    H7 = (3Nxxy − Nyyy)(Nxxx + Nxyy)[(Nxxx + Nxyy)² − 3(Nxxy + Nyyy)²]
                    − (Nxxx − 3Nxyy)(Nxxy + Nyyy)[3(Nxxx + Nxyy)² − (Nxxy + Nyyy)²]

Part IV  Machine Vision

This section describes conceptual information about high-level operations commonly used in machine vision applications, such as edge detection, pattern matching, dimensional measurements, and color inspection.

Part IV, Machine Vision, contains the following chapters:

Chapter 11, Edge Detection, describes edge detection techniques and tools that locate edges, such as the rake, concentric rake, spoke, and caliper.

Chapter 12, Pattern Matching, contains information about pattern matching and shape matching.

Chapter 13, Dimensional Measurements, contains information about analytic tools, clamps, line fitting, and coordinate systems.

Chapter 14, Color Inspection, contains information about the color spectrum, color matching, color location, and color pattern matching.

Chapter 15, Instrument Readers, contains information about meters, LCDs, and barcodes.

Edge Detection

Introduction

This chapter describes edge detection techniques and tools that locate edges, such as the rake, concentric rake, spoke, and caliper.

When to Use
be shown in a linear or logarithmic scale. A logarithmic scale lets you visualize gray-level values used by small numbers of pixels. These values might appear unused when the histogram is displayed in a linear scale. In a logarithmic scale, the vertical axis of the histogram gives the logarithm of the number of pixels per gray-level value. The use of minor gray-level values becomes more prominent at the expense of the dominant gray-level values. The logarithmic scale emphasizes small histogram values that are not typically noticeable in a linear scale.

Figure 4-4 illustrates the difference between the display of the histogram of the same image in a linear and a logarithmic scale. In this particular image, three pixels are equal to 0.

Figure 4-4. Histogram of the Same Image Using (a) Linear and (b) Logarithmic Vertical Scales

Histogram of Color Images

The histogram of a color image is expressed as a series of three tables, each corresponding to the histogram of one of the three primary components in the color model, as listed in Table 4-1.

Table 4-1. Color Models and Primary Components

Color Model    Components
RGB            Red, Green, Blue
HSL            Hue, Saturation, Luminance

Line Profile

A line profile plots the variations of intensity along a line. It returns the grayscale values of the pixels along a line and graphs them.
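The linear-versus-logarithmic distinction can be sketched as follows: an 8-bit histogram is just a 256-bin count, and the logarithmic display plots log10 of each nonzero count so that bins holding only a few pixels remain visible (an illustrative sketch, not the IMAQ Vision display code):

```python
import numpy as np

# Synthetic 8-bit image with three pixels equal to 0, mirroring the
# manual's example of a sparsely used gray level.
rng = np.random.default_rng(0)
image = rng.integers(100, 200, size=(64, 64)).astype(np.uint8)
image[0, :3] = 0

# Linear histogram: pixel count per gray level.
hist = np.bincount(image.ravel(), minlength=256)

# Logarithmic version: log10 of nonzero counts (empty bins stay 0 for display).
log_hist = np.zeros(256)
nonzero = hist > 0
log_hist[nonzero] = np.log10(hist[nonzero])
```

In the linear histogram the 3-pixel bin at gray level 0 is dwarfed by the dominant bins; in the logarithmic version it appears at log10(3) ≈ 0.48, clearly above zero.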
calibration algorithm you chose is adequate. IMAQ Vision returns a low quality score if you calibrate an image with high nonlinear distortion using the perspective method, or if you use a sparse grid to calibrate an image with high nonlinear distortion. You also can use this score to gauge whether the setup is behaving as expected. For example, if you are using a lens with very little lens distortion, perspective calibration should produce accurate results. However, system setup problems, such as a physically distorted calibration template, may cause a low quality score regardless of your lens quality.

The error map is an estimate of the positional error that you can expect when you convert a pixel coordinate into a real-world coordinate. The error map is a 2D array that contains the expected positional error for each pixel in the image. The error value of the pixel coordinate (i, j) indicates the largest possible location error for the estimated real-world coordinate (x, y) as compared to the true real-world location. The following equation shows how to calculate the error value:

e(i, j) = sqrt((x̂ − x)² + (ŷ − y)²)

where (x̂, ŷ) is the estimated real-world coordinate corresponding to pixel (i, j) and (x, y) is the true real-world location. The error value indicates the radial distance from the true real-world position within which the estimated real-world coordinate can lie. The error value has a confidence interval of 95%, which implies that the positional error of the estimated real-world coordinate is equal to or smaller than the error value.
Chapter 5, Image Processing

Spatial Filtering Concepts

Spatial Filter Classification Summary

Table 5-2 describes the different types of spatial filters.

Table 5-2. Spatial Filter Classifications

Filter Type    Filters
Linear         Highpass: Gradient, Laplacian
               Lowpass: Smoothing, Gaussian
Nonlinear      Highpass: Gradient (Roberts, Sobel, Prewitt), Differentiation, Sigma
               Lowpass: Median, Nth Order, Lowpass

Linear Filters

A linear filter replaces each pixel by a weighted sum of its neighbors. The matrix defining the neighborhood of the pixel also specifies the weight assigned to each neighbor. This matrix is called the convolution kernel. If the filter kernel contains both negative and positive coefficients, the transfer function is equivalent to a weighted differentiation and produces a sharpening, or highpass, filter. Typical highpass filters include gradient and Laplacian filters. If all coefficients in the kernel are positive, the transfer function is equivalent to a weighted summation and produces a smoothing, or lowpass, filter. Typical lowpass filters include smoothing and Gaussian filters.

Gradient Filter

A gradient filter highlights the variations of light intensity along a specific direction, which has the effect of outlining edges and revealing texture.

Given the following source
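As a sketch of the linear-filter mechanics described above, the following applies a simple 3 × 3 west/east gradient kernel by a sliding-window sum of products (an illustrative sketch, not the IMAQ Vision API; border pixels are left unchanged for simplicity):

```python
import numpy as np

def convolve3x3(image, kernel):
    """Apply a 3x3 kernel to a 2D image; border pixels are left unchanged."""
    out = image.astype(float).copy()
    k = np.asarray(kernel, dtype=float)
    h, w = image.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            region = image[y - 1:y + 2, x - 1:x + 2].astype(float)
            out[y, x] = (region * k).sum()
    return out

# Mixed negative and positive coefficients make this a highpass
# (sharpening) filter that responds to vertical edges.
gradient_we = [[-1, 0, 1],
               [-1, 0, 1],
               [-1, 0, 1]]

# Step edge: dark on the left, bright on the right.
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 100
response = convolve3x3(img, gradient_we)
```

The response is large only near the edge column and zero inside the flat regions, which is exactly the edge-outlining behavior the text describes.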
dimensional structure whose coefficients define how the filtered value at each pixel is computed. The filtered value of a pixel is a weighted combination of its original value and the values of its neighboring pixels. The convolution kernel coefficients define the contribution of each neighboring pixel to the pixel being updated. The convolution kernel size determines the number of neighboring pixels whose values are considered during the filtering process. In the case of a 3 × 3 kernel, illustrated in Figure 5-1a, the value of the central pixel (shown in black) is derived from the values of its eight surrounding neighbors (shown in gray). A 5 × 5 kernel, shown in Figure 5-1b, specifies 24 neighbors, a 7 × 7 kernel specifies 48 neighbors, and so forth.

Figure 5-1. Examples of Kernels (1: Kernel, 2: Image)

A filtering operation on an image involves moving the kernel from the leftmost and topmost pixel in the image to the rightmost and bottommost pixel in the image. At each pixel in the image, the new value is computed using the values that lie under the kernel, as shown in Figure 5-2.

Figure 5-2. Mechanics of Filtering (Kernel, Filtering Function, Neighbors, Central Pixel)

When computing the filtered values of the pixels that lie along the border of the
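The neighbor counts quoted above follow directly from the kernel size: an n × n kernel covers n² pixels, so it specifies n² − 1 neighbors of the central pixel. A quick check:

```python
def neighbor_count(n):
    """Number of neighboring pixels an n x n kernel considers (n odd)."""
    return n * n - 1

# 3x3 -> 8 neighbors, 5x5 -> 24, 7x7 -> 48, matching the text.
counts = {n: neighbor_count(n) for n in (3, 5, 7)}
```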
edge value and end edge value is returned as the edge location.

Figure 11-9. Advanced Edge Detection (1: Pixels, 2: Grayscale Values, 3: Width, 4: Steepness, 5: Contrast, 6: Edge Location)

Subpixel Accuracy

When the resolution of the image is high enough, most measurement applications make accurate measurements using pixel accuracy only. However, it is sometimes difficult to obtain the minimum image resolution needed by a machine vision application because of limits on the size of the sensors available or affordable. In these cases, you need to find edge positions with subpixel accuracy.

Subpixel analysis is a software method that estimates the pixel values that a higher-resolution imaging system would have provided. To compute the location of an edge with subpixel precision, the edge detection software first fits a higher-order interpolating function, such as a quadratic or cubic function, to the pixel intensity data. The interpolating function provides the edge detection algorithm with pixel intensity values between the original pixel values. The software then uses the intensity information to find the location of the edge with subpixel accuracy.

Figure 11-10 shows how a cubic spline function fits to a set of pixel values. Using this fit, values at locations between pixels are estimated. The edge detection
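A minimal sketch of the idea, fitting a parabola to the three gradient samples around the strongest edge pixel — a common subpixel estimator; IMAQ Vision's actual interpolating function (quadratic or cubic spline) may differ:

```python
import numpy as np

def subpixel_edge(profile):
    """Estimate an edge location with subpixel accuracy from a 1D profile.

    Fits a parabola to the gradient magnitude around its peak and returns
    the vertex position (in pixel coordinates of the profile).
    """
    grad = np.abs(np.diff(np.asarray(profile, dtype=float)))
    i = int(np.argmax(grad))
    if i == 0 or i == len(grad) - 1:
        return i + 0.5  # peak at the border: no parabola fit possible
    a, b, c = grad[i - 1], grad[i], grad[i + 1]
    # Vertex offset of the parabola through the three samples.
    offset = 0.5 * (a - c) / (a - 2 * b + c)
    return i + 0.5 + offset

# Ramp edge from 10 to 110 centered on sample index 4.
profile = [10, 10, 10, 10, 60, 110, 110, 110]
loc = subpixel_edge(profile)
```

The estimator returns 4.0, the true center of the ramp, even though no single pixel-accurate gradient sample sits exactly there.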
equalization of the interval [166, 200] can spread the information contained in the original third peak, ranging from 166 to 200, to the interval [0, 255]. The transformed image reveals details about the component with the original intensity range [166, 200], while all other components are set to black. An equalization from [166, 200] to [0, 255] produces the following image and histograms.

Convolution Kernels

Concepts

A convolution kernel defines a two-dimensional filter that you can apply to a grayscale image. A convolution kernel is a two-dimensional structure whose coefficients define the characteristics of the convolution filter that it represents. In a typical filtering operation, the coefficients of the convolution kernel determine the filtered value of each pixel in the image. IMAQ Vision provides a set of convolution kernels that you can use to perform different types of filtering operations on an image. You also can define your own convolution kernels, thus creating custom filters.

Use a convolution kernel whenever you want to filter a grayscale image. Filtering a grayscale image enhances the quality of the image to meet the requirements of your application. Use filters to smooth an image, remove noise from an image, enhance the edge information in an image, and so on.

A convolution kernel defines how a filter alters the pixel values in a grayscale image. The convolution kernel is a two-
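The interval operation described above can be sketched as a lookup-table transform. This sketch uses a simple linear stretch of [166, 200] onto [0, 255], with everything outside the interval mapped to black; true equalization redistributes levels by the cumulative histogram, so this is only an approximation of the idea:

```python
import numpy as np

def interval_stretch_lut(lo=166, hi=200):
    """Build a 256-entry LUT that spreads [lo, hi] onto [0, 255].

    Values outside the interval map to 0 (black). This is a linear
    approximation of the interval equalization described in the text.
    """
    lut = np.zeros(256, dtype=np.uint8)
    span = np.arange(lo, hi + 1)
    lut[lo:hi + 1] = np.round((span - lo) * 255.0 / (hi - lo)).astype(np.uint8)
    return lut

lut = interval_stretch_lut()
image = np.array([[150, 166],
                  [183, 200]], dtype=np.uint8)
stretched = lut[image]  # applying a LUT is just array indexing
```

Gray level 150 (outside the interval) goes to black, 166 maps to 0, 200 maps to 255, and levels in between spread across the full output range.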
errors.

Appendix A: Kernels

A kernel is a structure that represents a pixel and its relationship to its neighbors. This appendix lists a number of predefined kernels supported by IMAQ Vision.

Gradient Kernels

The following tables list the predefined gradient kernels.

3 × 3 Kernels

The following tables list the predefined gradient 3 × 3 kernels.

Prewitt Filters

The Prewitt filters have the following kernels. The notations West (W), South (S), East (E), and North (N) indicate which edges of bright regions they outline.

Table A-1. Prewitt Filters

0. W Edge          1. W Image
-1  0  1           -1  0  1
-1  0  1           -1  1  1
-1  0  1           -1  0  1

2. SW Edge         3. SW Image
 0  1  1            0  1  1
-1  0  1           -1  1  1
-1 -1  0           -1 -1  0

4. S Edge          5. S Image
-1 -1 -1           -1 -1 -1
 0  0  0            0  1  0
 1  1  1            1  1  1

6. SE Edge         7. SE Image
-1 -1  0           -1 -1  0
-1  0  1           -1  1  1
 0  1  1            0  1  1

8. E Edge          9. E Image
 1  0 -1            1  0 -1
 1  0 -1            1  1 -1
 1  0 -1            1  0 -1

10. NE Edge        11. NE Image
 0 -1 -1            0 -1 -1
 1  0 -1            1  1 -1
 1  1  0            1  1  0

12. N Edge         13. N Image
 1  1  1            1  1  1
 0  0  0            0  1  0
-1 -1 -1           -1 -1 -1

14. NW Edge        15. NW Image
 1  1  0            1  1  0
 1  0 -1            1  1 -1
 0 -1 -1            0 -1 -1

Sobel Filters

The Sobel filters are very similar to the Prewitt filters, except that they highlight light intensity variations along a particular axis that is assigned a stronger weight. The Sobel filters have the following kernels.
fault detection.

Interclass Variance

Interclass variance is a classical statistical technique used in discriminating factorial analysis. This method is well suited for images in which classes are not too disproportionate. For satisfactory results, the smallest class must be at least 5% of the largest one. Notice that this method tends to underestimate the class of the smallest standard deviation if the two classes have a significant variation.

Metric

Use this technique in situations similar to interclass variance. For each threshold, a value is calculated that is determined by the surfaces representing the initial gray scale. The optimal threshold corresponds to the smallest value.

Moments

This technique is suited for images that have poor contrast. The moments method is based on the hypothesis that the observed image is a blurred version of the theoretically binary original. The blurring that is produced from the acquisition process, caused by electronic noise or slight defocalization, is treated as if the statistical moments (average and variance) were the same for both the blurred image and the original image. This function recalculates a theoretical binary image.

In-Depth Discussion

Automatic Thresholding Techniques

All automatic thresholding methods use the histogram of an image to determine the threshold.
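The interclass-variance approach is essentially Otsu's method: choose the threshold that maximizes the variance between the two classes into which the histogram is split. A sketch (not the IMAQ Vision implementation):

```python
import numpy as np

def interclass_variance_threshold(image):
    """Pick the threshold maximizing between-class variance (Otsu's method)."""
    hist = np.bincount(np.asarray(image, dtype=np.uint8).ravel(), minlength=256)
    total = hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = hist[:t].sum()          # weight of the dark class
        w1 = total - w0              # weight of the bright class
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * hist[:t]).sum() / w0   # dark-class mean
        m1 = (levels[t:] * hist[t:]).sum() / w1   # bright-class mean
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Two well-separated classes: the threshold falls between them.
image = np.array([20] * 60 + [220] * 40, dtype=np.uint8)
t = interclass_variance_threshold(image)
```

With classes at gray levels 20 and 220, any threshold between them maximizes the between-class variance, so the dark class is separated cleanly.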
image. The calibration software also generates an error map. An error map returns an estimate of the worst-case error when a pixel coordinate is transformed into a real-world coordinate. Use the calibration information obtained from the calibration process to convert any pixel coordinate to its real-world coordinate and back.

Coordinate System

To express measurements in real-world units, you must define a coordinate system. Define a coordinate system by its origin, angle, and axis direction. Figure 3-6a shows the coordinate system of a calibration grid in the real world. Figure 3-6b shows the coordinate system of an image of the corresponding calibration grid.

The origin, expressed in pixels, defines the center of your coordinate system. The origins of the coordinate systems depicted in Figure 3-6 lie at the center of the circled dots. The angle specifies the orientation of your coordinate system with respect to the horizontal axis in the real world. Notice in Figure 3-6b that the horizontal axis automatically aligns to the top row of dots in the image of the grid. The calibration procedure determines the direction of the horizontal axis in the real world, which is along the topmost row of dots in the image of the grid.

Figure 3-6. Origin and Angle of a Coordinate System (1: Origin in the Real-World Grid, 2: Origin in the Grid Image)
image contains information in the form of pixels. Spatial calibration allows you to translate a measurement from pixel units into another unit, such as inches or centimeters.

This conversion is easy if you know a conversion ratio between pixels and real-world units. For example, if 1 pixel equals 1 inch, a length measurement of 10 pixels equals 10 inches. This conversion may not be straightforward, because perspective projection and lens distortion affect the measurement in pixels. Calibration accounts for possible errors by constructing mappings that you can use to convert between pixel and real-world units. You also can use the calibration information to correct perspective or nonlinear distortion errors for image display and shape measurements.

Calibrate your imaging system when you need to make accurate and reliable measurements. Use the IMAQ Vision calibration tools to do the following:

• Calibrate your imaging setup automatically by imaging a standard pattern (calibration template) or by providing reference points.
• Convert measurements (lengths, areas, widths) from real-world units to pixel units and back.
• Apply a learned calibration mapping to correct an image acquired through a calibrated setup.
• Assign an arbitrary coordinate system to measure positions in real-world units.
• Make real-world measurements on binary images.
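Under the simplest assumption of a known, uniform conversion ratio (no perspective projection or lens distortion), the pixel-to-real-world conversion is just a scale factor. A sketch, with the ratio chosen arbitrarily for illustration:

```python
# Simple pixel <-> real-world conversion, valid only when the imaging
# setup has a uniform scale and negligible distortion (illustrative;
# the 50 px/inch ratio is an assumption for this example).
PIXELS_PER_INCH = 50.0

def pixels_to_inches(length_px):
    return length_px / PIXELS_PER_INCH

def inches_to_pixels(length_in):
    return length_in * PIXELS_PER_INCH

width_in = pixels_to_inches(125)  # a 125-pixel measurement
```

Calibration generalizes exactly this mapping to the distorted case, where the ratio varies across the image.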
in an image with more detail, such as a higher color resolution, than a spectrum with fewer bins. In IMAQ Vision, you can choose between three color sensitivity settings: low, medium, and high. Low divides the hue color space into seven sectors, giving a total of 2 × 7 + 2 = 16 bins. Medium divides the hue color space into 14 sectors, giving a total of 2 × 14 + 2 = 30 bins. High divides the hue color space into 28 sectors, giving a total of 2 × 28 + 2 = 58 bins.

The value of each element in the color spectrum indicates the percentage of image pixels in each color bin. When the number of bins is set according to the color sensitivity parameter, the machine vision software scans the image, counts the number of pixels that fall into each bin, and stores the ratio of the count and total number of pixels in the image in the appropriate element within the color spectrum array. The software also applies a special adaptive learning algorithm to determine if pixels are either black or white before assigning them to a color bin.

Figure 14-4b represents the low-sensitivity color spectrum of Figure 14-4a. The height of each bar corresponds to the percentage of pixels in the image that fall into the corresponding bin. The color spectrum contains useful information about the color distribution in the image. You can analyze the color spectrum to get information such as the most dominant color in the image, which is the element with the highest value.
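The binning arithmetic above can be sketched as follows: the hue circle is divided into n sectors for each of two saturation ranges, plus one black bin and one white bin. The black/white test here uses fixed luminance cutoffs, which is an assumption; IMAQ Vision uses an adaptive learning algorithm instead:

```python
import colorsys

def spectrum_bins(sensitivity):
    """Total number of color spectrum bins for a sensitivity setting."""
    sectors = {"low": 7, "medium": 14, "high": 28}[sensitivity]
    return 2 * sectors + 2  # two saturation ranges per sector + black + white

def bin_index(r, g, b, sectors=7, sat_split=0.5):
    """Assign a normalized RGB pixel to a bin (simplified sketch)."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    if l < 0.1:
        return 2 * sectors          # black bin
    if l > 0.9 and s < 0.1:
        return 2 * sectors + 1      # white bin
    sector = min(int(h * sectors), sectors - 1)
    # High-saturation pixels use the first block of bins, low-saturation
    # pixels the second block.
    return sector if s >= sat_split else sectors + sector

n_bins = spectrum_bins("low")
```

Accumulating `bin_index` counts over an image and dividing by the pixel total yields the percentage-valued spectrum array the text describes.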
increase the contrast in dark areas at the expense of the contrast in bright areas.

The following graphs show how the transformations behave. The horizontal axis represents the input gray-level range, and the vertical axis represents the output gray-level range. Each input gray-level value is plotted vertically, and its point of intersection with the lookup curve is plotted horizontally to give an output value.

The Logarithmic, Square Root, and Power 1/Y functions expand intervals containing low gray-level values while compressing intervals containing high gray-level values. The higher the gamma coefficient Y, the stronger the intensity correction. The Logarithmic correction has a stronger effect than the Power 1/Y function.

Logarithmic and Inverse Gamma Correction Examples

The following series of illustrations presents the linear and cumulative histograms of an image after various LUT transformations. The more the histogram is compressed on the right, the brighter the image.

Note: Graphics on the left represent the original image, graphics on the top right represent the linear histogram, and graphics on the bottom right represent the cumulative histogram.

The following graphic shows the original image and histograms. A Power 1/Y transformation, where Y = 1.5, produces the following image and histograms.
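A Power 1/Y lookup table can be sketched as follows for an 8-bit range (the normalization convention, mapping through [0, 1], is an assumption):

```python
import numpy as np

def power_inv_gamma_lut(gamma=1.5):
    """Power 1/Y LUT for 8-bit gray levels: out = 255 * (in/255) ** (1/Y).

    With Y > 1 this expands low gray-level intervals and compresses
    high ones, brightening dark regions of the image.
    """
    levels = np.arange(256) / 255.0
    return np.round(255.0 * levels ** (1.0 / gamma)).astype(np.uint8)

lut = power_inv_gamma_lut(1.5)
```

The endpoints 0 and 255 are fixed, every intermediate level is lifted upward, and the curve stays monotonic, which is exactly the behavior the graphs above illustrate.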
inspect it completely. You can image these different regions by moving the object until the desired region lies under the camera, or by moving the camera so that it lies above the desired region. In either case, each image maps to different regions in the real world. You can specify a new position for the origin and orientation of the coordinate system so that the origin lies on a point on the object under inspection.

Figure 3-12 shows an inspection application whose objective is to determine the location of the hole in the board with respect to the corner of the board. The board is on a stage that can translate in the x and y directions and can rotate about its center. The corner of the board is located at the center of the stage. In the initial setup, shown in Figure 3-12a, you define a coordinate system that aligns with the corner of the board using simple calibration. Specify the origin of the coordinate system as the location, in pixels, of the corner of the board; set the angle of the axis to 180°; and set the axis direction to indirect. Use pattern matching to find the location, in pixels, of the hole, indicated by the crosshair in Figure 3-12a. Convert the location of the hole in pixels to a real-world location. This conversion returns the real-world location of the hole with respect to the defined coordinate system.
interest, assigned a pixel value of 1, and background, assigned pixel values of 0, based on the intensities of the image pixels.

The number of bits (n) used to encode the value of a pixel. For a given n, a pixel can take 2^n different values. For example, if n equals 8 bits, a pixel can take 256 different values, ranging from 0 to 255. If n equals 16 bits, a pixel can take 65,536 different values, ranging from 0 to 65,535 or from -32,768 to 32,767.

The level that represents the darkest an image can get. See also white reference level.

Reduces the amount of detail in an image. Blurring commonly occurs because the camera is out of focus. You can blur an image intentionally by applying a lowpass frequency filter.

Bitmap. Image file format commonly used for 8-bit and color images (extension .BMP).

Removes objects or particles in a binary image that touch the image border.

1. A constant added to the red, green, and blue components of a color pixel during the color decoding process. 2. The perception by which white objects are distinguished from gray, and light objects from dark objects.

Temporary storage for acquired data.

C

caliper
center of mass
character recognition
chroma
chromaticity
chrominance
circle function
closing
clustering
CLUT
color images
color space
complex image

A measurement function that finds edge pairs along a specified p
is best described by the different sensations of color that the human brain perceives. The color-sensitive cells in the eye's retina sample color using three bands that correspond to red, green, and blue light. The signals from these cells travel to the brain, where they combine to produce different sensations of colors. The Commission Internationale de l'Éclairage (CIE) has defined the following sensations:

• Brightness: The sensation of an area exhibiting more or less light.
• Hue: The sensation of an area appearing similar to a combination of red, green, and blue.
• Colorfulness: The sensation of an area appearing to exhibit more or less of its hue.
• Lightness: The sensation of an area's brightness relative to a reference white in the scene.
• Chroma: The colorfulness of an area with respect to a reference white in the scene.
• Saturation: The colorfulness of an area relative to its brightness.

The trichromatic theory describes how three separate lights (red, green, and blue) can be combined to match any visible color. This theory is based on the three color sensors that the eye uses. Printing and photography use the trichromatic theory as the basis for combining three different colored dyes to reproduce colors in a scene. Computer color spaces also use three parameters to define a color. Most color spaces are geared toward displaying images with hardware, such as color monitors and printers, or toward applications that man
is processed. Pixels along the edge of an image do not have neighbors on all four sides. If you need to use a function that processes pixels based on the value of their neighboring pixels, specify an image border that surrounds the image to account for these outlying pixels. You define the image border by specifying a border size and the values of the border pixels.

The size of the border should accommodate the largest pixel neighborhood required by the function you are using. The size of the neighborhood is specified by the size of a 2D array. For example, if a function uses the eight adjoining neighbors of a pixel for processing, the size of the neighborhood is 3 × 3, indicating an array with three columns and three rows. Set the border size to be greater than or equal to half the number of rows or columns of the 2D array, rounded down to the nearest integer value. For example, if a function uses a 3 × 3 neighborhood, the image should have a border size of at least 1; if a function uses a 5 × 5 neighborhood, the image should have a border size of at least 2. In IMAQ Vision, an image is created with a default border size of 3. This supports any function using up to a 7 × 7 neighborhood without any modification.

IMAQ Vision provides three ways to specify the pixel values of the image border. Figure 1-3 illustrates these options. Figure 1-3a shows the pixel values of an image. You can set all image border pixels to zero (this is the default case), as shown in Figure 1-3b.
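The border-size rule above reduces to floor(kernel size / 2):

```python
def min_border_size(kernel_rows):
    """Minimum image border size for a square neighborhood (e.g. 3, 5, 7)."""
    return kernel_rows // 2  # half the kernel size, rounded down

# 3x3 -> 1, 5x5 -> 2, 7x7 -> 3: the default border of 3 covers up to 7x7.
sizes = {k: min_border_size(k) for k in (3, 5, 7)}
```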
Table A-2. Sobel Filters

0. W Edge          1. W Image
-1  0  1           -1  0  1
-2  0  2           -2  1  2
-1  0  1           -1  0  1

2. SW Edge         3. SW Image
 0  1  2            0  1  2
-1  0  1           -1  1  1
-2 -1  0           -2 -1  0

4. S Edge          5. S Image
-1 -2 -1           -1 -2 -1
 0  0  0            0  1  0
 1  2  1            1  2  1

6. SE Edge         7. SE Image
-2 -1  0           -2 -1  0
-1  0  1           -1  1  1
 0  1  2            0  1  2

8. E Edge          9. E Image
 1  0 -1            1  0 -1
 2  0 -2            2  1 -2
 1  0 -1            1  0 -1

10. NE Edge        11. NE Image
 0 -1 -2            0 -1 -2
 1  0 -1            1  1 -1
 2  1  0            2  1  0

12. N Edge         13. N Image
 1  2  1            1  2  1
 0  0  0            0  1  0
-1 -2 -1           -1 -2 -1

14. NW Edge        15. NW Image
 2  1  0            2  1  0
 1  0 -1            1  1 -1
 0 -1 -2            0 -1 -2

7 × 7 Kernels

The following table lists the predefined gradient 7 × 7 kernels.

Table A-4. Gradient 7 × 7
(0. W Edge, 1. W Image, 2. S Edge, 3. S Image, 4. E Edge, 5. E Image, 6. N Edge, 7. N Image)
matrix containing values of 1. This matrix, shown below, is the default structuring element for most binary and grayscale morphological transformations:

1 1 1
1 1 1
1 1 1

Three factors influence how a structuring element defines which pixels to process during a morphological transformation: the size of the structuring element, the values of the structuring element sectors, and the shape of the pixel frame.

Structuring Element Size

The size of a structuring element determines the size of the neighborhood surrounding the pixel being processed. The coordinates of the pixel being processed are determined as a function of the structuring element. In Figure 9-1, the coordinates of the pixels being processed are (1, 1), (2, 2), and (3, 3), respectively. The origin (0, 0) is always the top left corner pixel.

Figure 9-1. Structuring Element Sizes

Using structuring elements requires an image border. A 3 × 3 structuring element requires a minimum border size of 1. In the same way, structuring elements of 5 × 5 and 7 × 7 require a minimum border size of 2 and 3, respectively. Bigger structuring elements require corresponding increases in the image border size. For more information about image borders, refer to the Image Borders section of Chapter 1, Digital Images.

Note: IMAQ Vision images have a default border size of 3. This border size enables you to use structuring elem
of occurrence of light intensity variations in the spatial domain. The low frequencies correspond to smooth and gradual intensity variations found in the overall patterns of the source image. The high frequencies correspond to abrupt and short intensity variations found at the edges of objects, around noisy pixels, and around details.

FFT Representation

There are two possible representations of the Fast Fourier transform of an image: the standard representation and the optical representation.

Standard Representation

In the standard representation, high frequencies are grouped at the center of the image, while low frequencies are located at the edges. The constant term, or null frequency, is in the upper left corner of the image. The frequency range is [0, N] × [0, M], where M is the horizontal resolution of the image and N is the vertical resolution of the image.

(Diagram: in the standard representation, low frequencies lie at the corners and edges of the image, and high frequencies are grouped at the center.)

Note: IMAQ Vision uses this representation to represent complex images in memory. Use this representation when building an image mask.

Figure 7-1a shows an image. Figure 7-1b shows the FFT of the same image using the standard representation.

Figure 7-1. (a) Original Image, (b) FFT in Standard Representation
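NumPy's `fft2` produces the same standard layout, with the null-frequency (DC) term in the upper-left corner, which can be checked directly:

```python
import numpy as np

# A constant image has no intensity variation, so all spectral energy
# sits in the null-frequency term at the upper-left corner.
image = np.ones((8, 8))
spectrum = np.fft.fft2(image)

dc = spectrum[0, 0]                          # null-frequency term
rest = np.abs(spectrum).sum() - np.abs(dc)   # energy everywhere else
```

For an 8 × 8 image of ones, the DC term equals 64 (the sum of all pixels) and every other coefficient is numerically zero, confirming the corner placement of the null frequency.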
of the particle Area covering the Image Area:
(A / A_image) × 100

% Area / (Particle & Holes Area): Percentage of the particle Area in relation to its Particle & Holes Area:
(A / A_T) × 100

Ratio of Equivalent Ellipse Axes: Equivalent Ellipse Major Axis divided by Equivalent Ellipse Minor Axis:
E_2a / E_2b

Ratio of Equivalent Rect Sides: Equivalent Rect Long Side divided by Equivalent Rect Short Side:
R_a / R_b

Factors

Table 10-8 lists the IMAQ Vision particle factor measurements.

Table 10-8. Factors

Elongation Factor: Max Feret Diameter divided by Equivalent Rect Short Side (Feret). The more elongated the shape of a particle, the higher its elongation factor.
F_max / R_F

Compactness Factor: Area divided by the product of Bounding Rect Width and Bounding Rect Height. The compactness factor belongs to the interval [0, 1].
A / (W × H)

Heywood Circularity Factor: Perimeter divided by the circumference of a circle with the same area. The closer the shape of a particle is to a disk, the closer the Heywood circularity factor is to 1.
P / (2 sqrt(π A))

Type Factor: Factor relating area to moment of inertia.
A² / (4π sqrt(I_xx I_yy))

Sums

Table 10-9 lists the IMAQ Vision particle sum measurements.

Table 10-9. Sums
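For example, the Heywood circularity factor can be computed as follows (a sketch following the formula above, not the IMAQ Vision API):

```python
import math

def heywood_circularity(perimeter, area):
    """Perimeter divided by the circumference of the equal-area circle.

    A disk gives exactly 1; less circular shapes give larger values.
    """
    return perimeter / (2.0 * math.sqrt(math.pi * area))

# A disk of radius r: perimeter 2*pi*r, area pi*r^2 -> factor exactly 1.
r = 3.0
disk = heywood_circularity(2 * math.pi * r, math.pi * r ** 2)

# A unit square: perimeter 4, area 1 -> 4 / (2*sqrt(pi)) ~ 1.128.
square = heywood_circularity(4.0, 1.0)
```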
of the structuring element equal to 1 are then referred to as Pi.

• If the value of one pixel Pi is equal to 0, then P0 is set to 0; else P0 is set to 1.
• If AND(Pi) = 1, then P0 = 1; else P0 = 0.

A dilation eliminates tiny holes isolated in particles and expands the particle contours according to the template defined by the structuring element. This function has the opposite effect of an erosion, because the dilation is equivalent to eroding the background. For a given pixel P0, the structuring element is centered on P0. The pixels masked by a coefficient of the structuring element equal to 1 are then referred to as Pi.

• If the value of one pixel Pi is equal to 1, then P0 is set to 1; else P0 is set to 0.
• If OR(Pi) = 1, then P0 = 1; else P0 = 0.

Figure 9-12 illustrates the effects of erosion and dilation. Figure 9-12a is the binary source image, Figure 9-12b represents the source image after erosion, and Figure 9-12c shows the source image after dilation.

Figure 9-12. Erosion and Dilation Functions

Figure 9-13 is the source image for the examples in Tables 9-2 and 9-3, in which gray cells indicate pixels equal to 1.

Figure 9-13. Source Image before Erosion and Dilation

Tables 9-2 and 9-3 show how the structuring element can control the effects of erosion or dilation, respectively. The larger the structuring element
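The AND and OR rules can be sketched directly for a binary image and a 3 × 3 structuring element of ones (border pixels are left as background for simplicity, which is an assumption of this sketch, not IMAQ Vision's border handling):

```python
import numpy as np

def erode(img):
    """Binary erosion with a 3x3 structuring element of ones (AND rule)."""
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # P0 survives only if every masked neighbor Pi equals 1.
            out[y, x] = int(img[y - 1:y + 2, x - 1:x + 2].all())
    return out

def dilate(img):
    """Binary dilation, equivalent to eroding the background (OR rule)."""
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # P0 is set if any masked neighbor Pi equals 1.
            out[y, x] = int(img[y - 1:y + 2, x - 1:x + 2].any())
    return out

square = np.zeros((7, 7), dtype=np.uint8)
square[2:5, 2:5] = 1       # a 3x3 particle
eroded = erode(square)     # shrinks to the single center pixel
dilated = dilate(square)   # expands to a 5x5 block
```

Erosion peels one layer off the particle contour, while dilation adds one, matching Figures 9-12b and 9-12c.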
palette has a gradation from light brown to dark brown. 0 is black and 255 is white.

Rainbow Palette

This palette has a gradation from blue to red, with a prominent range of greens in the middle value range. 0 is blue and 255 is red.

Gradient Palette

This palette has a gradation from red to white, with a prominent range of light blue in the upper value range. 0 is black and 255 is white.

Binary Palette

This palette has 17 cycles of 15 different colors. Table 2-1 illustrates these colors, where g is the gray-level value.

Table 2-1. Gray-Level Values in the Binary Palette

g    R    G    B    Resulting Color
1    255  0    0    Red
2    0    255  0    Green
3    0    0    255  Blue
4    255  255  0    Yellow
5    255  0    255  Purple
6    0    255  255  Aqua
7    255  127  0    Orange
8    255  0    127  Magenta
9    127  255  0    Bright green
10   127  0    255  Violet
11   0    127  255  Sky blue
12   0    255  127  Sea green
13   255  127  127  Rose
14   127  255  127  Spring green
15   127  127  255  Periwinkle

The values 0 a
planks or detecting cracks on plastic sheets. You also can locate objects in motion control applications. In applications where there is significant variance in the shape or orientation of an object, particle analysis is a powerful and flexible way to search for the object. You can use a combination of the measurements obtained through particle analysis to define a feature set that uniquely defines the shape of the object.

Particle Analysis Concepts

A typical particle analysis process scans through an entire image, detects all the particles in the image, and builds a detailed report on each particle. You can use multiple parameters, such as perimeter, angle, area, and center of mass, to identify and classify these particles. Using multiple parameters can be faster and more effective than pattern matching in many applications. Also, by using different sets of parameters, you can uniquely identify a feature in an image. For example, you could use the area of the template particle as a criterion for removing all particles that do not match it within some tolerance. You then can perform a more refined search on the remaining particles using another list of parameter tolerances.

The following figure shows a sample list of parameters that you can obtain in a particle analysis application. The binary image in this example was obtained by thresholding the source image and rem
problem in segmenting an image into particle and background regions occurs when the boundaries are not sharply demarcated. In such a case, the choice of a correct threshold becomes subjective. Therefore, you may want to enhance your images before thresholding to outline where the correct borders lie. You can use lookup tables, filters, FFTs, or equalize functions to enhance your images. Observing the intensity profile of a line crossing a boundary area is also helpful in selecting a correct threshold value. Finally, keep in mind that morphological transformations can help you retouch the shape of binary particles and therefore correct unsatisfactory selections that occurred during thresholding.

Automatic Threshold

Various automatic thresholding techniques are available:

• Clustering
• Entropy
• Interclass Variance
• Metric
• Moments

In contrast to manual thresholding, these methods do not require that you set the minimum and maximum light intensities. These techniques are well suited for conditions in which the light intensity varies. Depending on your source image, it is sometimes useful to invert (reverse) the original grayscale image before applying an automatic threshold function, such as moments or entropy. This is especially true for cases in which the background is brighter than the foreground. Clustering is the only multi-class thresholding method available. Clustering operates
seven-segment displays that use either LCDs (liquid crystal displays) or LEDs (seven segments composed of light-emitting diodes) or electroluminescent indicators. The functions in this library can perform the following tasks:
• Detect the area around each seven-segment digit from a rectangular area that contains multiple digits
• Read the value of a single digit
• Read the value, sign, and decimal separator of the displayed number

LCD Algorithm Limits

Four factors can cause a bad detection:
• Very high horizontal or vertical light drift
• Very low contrast between the background and the segments
• Very high level of noise
• Very low resolution of the image

Each of these factors is quantified to indicate when the algorithm might not give accurate results. Light drift is quantified by the difference between the average pixel values at the top left and the bottom right of the background of the LCD screen. Detection results might be inaccurate when light drift is greater than 90 in 8-bit images. Contrast is measured as the difference between the average pixel values in a rectangular region in the background and a rectangular region in a segment. This difference must be greater than 30 in 8-bit images (256 gray levels) to obtain accurate results. Noise is defined as the standard deviation of the pixel values contained in a rectangular region in the background. This value must be less t
size as the original image, as shown in Figure 3-9b. With the scale-to-preserve-area option, the corrected image is scaled such that features in the image retain the same area as they did in the original image, as shown in Figure 3-9c. Images that are scaled to preserve area are usually larger than the original image. Because scaling to preserve the area increases the size of the image, the processing time for the function may increase.

Figure 3-9. Scaling Modes

The scaling mode you choose depends on your application. Scale to preserve the area when your vision application requires the true area of objects in the image. Use scale-to-fit for all other vision applications.

Correction Region

You can correct an entire image, or regions in the image based on user-defined ROIs or the calibration ROI defined by the calibration software. Figure 3-10 illustrates the different image areas you can specify for correction. IMAQ Vision learns calibration information for only the regions you specify.

1. Full Image  2. User or Calibration ROI  3. User ROI  4. User and Calibration ROI  5. Calibration ROI

Figure 3-10. ROI Modes

• Full Image: Corrects the entire image, regardless of the calibration ROI and the user-defined R
system to gauge this parameter. If the resolution of the imaging system is very high, use a small pixel radius to minimize the use of outlying points in the line fit. Use a higher pixel radius if your image is noisy.

The minimum score allows you to improve the quality of the estimated line. The line fitting function removes the point furthest from the fit line, refits a line to the remaining points, and computes the mean squared distance (MSD) of the line. Next, the function computes a line fit score (LFS) for the new fit using the following equation:

LFS = (1 − MSD / PR²) × 1000

where PR is the pixel radius. IMAQ Vision repeats the entire process until the score is greater than or equal to the minimum score, or until the number of iterations exceeds the user-defined maximum number of iterations.

Use a high minimum score to obtain the most accurate line fit. For example, combining a large pixel radius and a high minimum score produces an accurate fit within a very noisy data set. A small pixel radius and a small minimum score produce a robust fit in a standard data set.

The maximum number of iterations defines a limit in the search for a line that satisfies the minimum score. If you reach the maximum number of iterations before the algorithm finds a line matching the desired minimum score, the algorithm stops and returns the current line. If you do not need to improve the quality of the line in order to obtain
that are easy for humans to quantify. Hue, saturation, and brightness are characteristics that distinguish one color from another in the HSL space. Hue corresponds to the dominant wavelength of the color. The hue component is a color, such as orange, green, or violet. You can visualize the range of hues as a rainbow. Saturation refers to the amount of white added to the hue and represents the relative purity of a color. A color without any white is fully saturated. The degree of saturation is inversely proportional to the amount of white light added. Colors such as pink (red and white) and lavender (purple and white) are less saturated than red and purple. Brightness embodies the chromatic notion of luminance, or the amplitude (power) of light. Chromaticity is the combination of hue and saturation, and the relationship between chromaticity and brightness characterizes a color. Systems that manipulate hue use the HSL color space, which also can be written as HSI (hue, saturation, and intensity) or HSV (hue, saturation, and value). The coordinate system for the HSL color space is cylindrical. Colors are defined inside a hexcone, as shown in Figure 14-1.

HSL Color Space

The hue value runs from 0 to 360. The saturation ranges from 0 to 1, where 1 represents the purest color (no white). Luminance also ranges from 0 to 1, where 0 is black and 1 is white. Overall, two principal factors, the de-coupling of the intensity co
the desired results, set the maximum iterations value to 0 in the line fit function.

Color Inspection

This chapter contains information about the color spectrum, color matching, color location, and color pattern matching.

The Color Spectrum

The color spectrum represents the three-dimensional color information associated with an image, or a region of an image, in a concise one-dimensional form that can be used by many of the IMAQ Vision color processing functions. Use the color spectrum for color matching, color location, and color pattern matching applications with IMAQ Vision. The color spectrum is a one-dimensional representation of the three-dimensional color information in an image. The spectrum represents all the color information associated with that image, or a region of the image, in the HSL space. The information is packaged in a form that can be used by the color processing functions in IMAQ Vision.

Color Space Used to Generate the Spectrum

The color spectrum represents the color distribution of an image in the HSL space, as shown in Figure 14-1. If the input image is in RGB format, the image is first converted to HSL format, and the color spectrum is computed from the HSL space. Using HSL images directly, such as those acquired with an IMAQ PCI/PXI-1411 image acquisition device with onboard RGB-to-HSL conversion, for color matching improves the operat
the following figure represent the first pixel of each particle.

• Center of Mass x: X-coordinate of the particle center of mass; equation: Σx / A
• Center of Mass y: Y-coordinate of the particle center of mass; equation: Σy / A
• First Pixel x: X-coordinate of the first particle pixel
• First Pixel y: Y-coordinate of the first particle pixel

Table 10-2. Coordinates (Continued)

• Bounding Rect Left: X-coordinate of the leftmost particle point
• Bounding Rect Top: Y-coordinate of the highest particle point
• Bounding Rect Right: X-coordinate of the rightmost particle point
• Bounding Rect Bottom: Y-coordinate of the lowest particle point
• Max Feret Diameter Start x: X-coordinate of the Max Feret Diameter start
• Max Feret Diameter Start y: Y-coordinate of the Max Feret Diameter start
• Max Feret Diameter End x: X-coordinate of the Max Feret Diameter end
• Max Feret Diameter End y: Y-coordinate of the Max Feret Diameter end
• Max Horiz Segment Length Left: X-coordinate of the leftmost pixel in the Max Horiz Segment; always given as a pixel measurement
• Max Horiz Segment Length Right: X-coordinate of the rightmost pixel in the Max H
to the pixels added by the dilation process. If I is an image:

external edge(I) = dilation(I) XOR I

Figure 9-15a shows the binary source image. Figure 9-15b shows the image produced from an extraction using a 5 × 5 structuring element. The superimposition of the internal edge is shown in white, and the external edge is shown in gray. The thickness of the extracted contours depends on the size of the structuring element.

Figure 9-15. External Edges

Hit-Miss Function

Use the hit-miss function to locate particular configurations of pixels. This function extracts each pixel located in a neighborhood exactly matching the template defined by the structuring element. Depending on the configuration of the structuring element, the hit-miss function can locate single isolated pixels, cross-shape or longitudinal patterns, right angles along the edges of particles, and other user-specified shapes. The larger the size of the structuring element, the more specific the searched template can be. Refer to Table 9-4 for strategies on using the hit-miss function.

In a structuring element with a central coefficient equal to 0, a hit-miss function changes all pixels set to 1 in the source image to the value 0. For a given pixel P0, the structuring element is centered on P0. The pixels masked by the structuring element are then referred to as Pi. If the value of each pixel Pi
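The isolated-pixel case of the hit-miss function can be sketched with plain NumPy; the helper below is an assumed illustration of one specific template (hit the center pixel, miss all eight neighbors), not the IMAQ Vision implementation:

```python
import numpy as np

def hit_miss_isolated(img):
    """Hit-miss with a 3 x 3 template: center must be foreground (hit)
    and all eight neighbors background (miss). Out-of-bounds pixels
    count as background via zero padding."""
    p = np.pad(img.astype(bool), 1, constant_values=False)
    center = p[1:-1, 1:-1]
    neighbors = np.zeros(center.shape, dtype=int)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:  # skip the center offset
                neighbors += p[1 + di:p.shape[0] - 1 + di,
                               1 + dj:p.shape[1] - 1 + dj]
    return center & (neighbors == 0)

img = np.zeros((5, 5), dtype=int)
img[2, 2] = 1               # isolated pixel
img[0, 0] = img[0, 1] = 1   # touching pair: not isolated
print(np.argwhere(hit_miss_isolated(img)))  # [[2 2]]
```

Other templates (corners, line ends) follow the same pattern with different hit/miss masks.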
used to inspect printed circuit boards containing a variety of components, including diodes, resistors, integrated circuits, and capacitors. In a manufacturing environment, color matching can find flaws in a manufactured product when the flaws are accompanied by a color change.

Color Matching Concepts

Color matching is performed in two steps. In the first step, the machine vision software learns a reference color distribution. In the second step, the software compares color information from other images to the reference image and returns a score as an indicator of similarity.

Learning Color Distribution

The machine vision software learns a color distribution by generating a color spectrum. You provide the software with an image, or regions in the image, containing the color information that you want to use as a reference in your application. The machine vision software then generates a color spectrum based on the information you provide. The color spectrum becomes the basis of comparison during the matching phase.

Comparing Color Distributions

During the matching phase, the color spectrum obtained from the target image, or a region in the target image, is compared to the reference color spectrum taken during the learning step. A match score is computed based on the similarity between these two color spectrums, using the Manhattan distance between two vectors. A fuzzy member
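A minimal sketch of scoring two color spectra with the Manhattan distance follows; the 0-1000 scale and the linear mapping are assumptions for illustration, not the exact NI formula, which also applies a fuzzy weighting not shown here:

```python
import numpy as np

def color_match_score(reference, target):
    """Score two unit-sum color spectra with the Manhattan (L1) distance:
    1000 = identical, 0 = completely different (illustrative scale)."""
    d = np.abs(np.asarray(reference, float) - np.asarray(target, float)).sum()
    return 1000.0 * max(0.0, 1.0 - d / 2.0)  # L1 distance of two unit-sum vectors is at most 2

a = [0.5, 0.3, 0.2]
print(color_match_score(a, a))                        # 1000.0
print(round(color_match_score(a, [0.2, 0.3, 0.5])))   # 700
```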
where pa is the value of pixel P in image a, and pb is the value of pixel P in image b.

Arithmetic Operators

The equations in Table 6-1 describe the usage of arithmetic operators with 8-bit resolution images a and b.

Table 6-1. Arithmetic Operators

• Multiply: pn = min(pa × pb, 255)
• Divide: pn = max(pa / pb, 0)
• Add: pn = min(pa + pb, 255)
• Subtract: pn = max(pa − pb, 0)
• Modulo: pn = pa mod pb
• Absolute Difference: pn = |pa − pb|

If the resulting pixel value pn is negative, it is set to 0. If it is greater than 255, it is set to 255.

Logic and Comparison Operators

Logic operators are bitwise operators. They manipulate gray-level values coded on one byte at the bit level. The equations in Table 6-2 describe the usage of logical operators. The truth tables for logic operators are presented in the Truth Tables section.

Table 6-2. Logical and Comparison Operators

Logical Operators
• AND: pn = pa AND pb
• NAND: pn = pa NAND pb
• OR: pn = pa OR pb
• NOR: pn = pa NOR pb
• XOR: pn = pa XOR pb
• Logic Difference: pn = pa AND (NOT pb)

Comparison Operators
• Mask: if pb = 0, then pn = 0, else pn = pa
• Mean: pn = mean(pa, pb)
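The saturating Add and Subtract rules in Table 6-1 can be reproduced with NumPy, widening to a signed type first so the clamp happens before casting back to 8 bits:

```python
import numpy as np

def add8(a, b):
    """Add: pn = min(pa + pb, 255)."""
    return np.minimum(a.astype(np.int16) + b, 255).astype(np.uint8)

def sub8(a, b):
    """Subtract: pn = max(pa - pb, 0)."""
    return np.maximum(a.astype(np.int16) - b, 0).astype(np.uint8)

a = np.array([200, 100], dtype=np.uint8)
b = np.array([100, 150], dtype=np.uint8)
print(add8(a, b))  # [255 250]
print(sub8(a, b))  # [100   0]
```

Without the widening step, plain uint8 arithmetic would wrap around (200 + 100 → 44) instead of saturating.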
you encode your image depends on the nature of the image acquisition device, the type of image processing you need to use, and the type of analysis you need to perform. For example, 8-bit encoding is sufficient if you need to obtain the shape information of objects in an image. However, if you need to precisely measure the light intensity of an image, or a region in an image, you must use 16-bit or floating-point encoding. Use color-encoded images when your machine vision or image processing application depends on the color content of the objects you are inspecting or analyzing.

IMAQ Vision does not directly support other types of image encoding, particularly images encoded as 1-bit, 2-bit, or 4-bit images. In these cases, IMAQ Vision automatically transforms the image into an 8-bit image (the minimum bit depth for IMAQ Vision) when opening the image file.

Number of Planes

The number of planes in an image corresponds to the number of arrays of pixels that compose the image. A grayscale or pseudo-color image is composed of one plane, while a true-color image is composed of three planes: one each for the red component, green component, and blue component. In true-color images, the color component intensities of a pixel are coded into three different values. The color image is the combination of three arrays of pixels corresponding to the red, green, and blue components in an RGB image. HSL images are defi
Nonlinear Prewitt Filter

P(i,j) = max(|P(i+1,j−1) + P(i+1,j) + P(i+1,j+1) − P(i−1,j−1) − P(i−1,j) − P(i−1,j+1)|,
|P(i−1,j−1) + P(i,j−1) + P(i+1,j−1) − P(i−1,j+1) − P(i,j+1) − P(i+1,j+1)|)

Nonlinear Sobel Filter

P(i,j) = max(|P(i+1,j−1) + 2P(i+1,j) + P(i+1,j+1) − P(i−1,j−1) − 2P(i−1,j) − P(i−1,j+1)|,
|P(i−1,j−1) + 2P(i,j−1) + P(i+1,j−1) − P(i−1,j+1) − 2P(i,j+1) − P(i+1,j+1)|)

Nonlinear Gradient Filter

The new value of a pixel becomes the maximum absolute value between its deviation from the upper neighbor and the deviation of its two left neighbors:

P(i,j) = max(|P(i,j−1) − P(i,j)|, |P(i−1,j−1) − P(i−1,j)|)

Roberts Filter

The new value of a pixel becomes the maximum absolute value between the deviation of its upper-left neighbor and the deviation of its two other neighbors:

P(i,j) = max(|P(i−1,j−1) − P(i,j)|, |P(i,j−1) − P(i−1,j)|)

Differentiation Filter

The new value of a pixel becomes the absolute value of its maximum deviation from its upper-left neighbors:

P(i,j) = max(|P(i−1,j−1) − P(i,j)|, |P(i−1,j) − P(i,j)|, |P(i,j−1) − P(i,j)|)

Sigma Filter

If |P(i,j) − M| > S, then P(i,j) = P(i,j); else P(i,j) = M.

Given M, the mean value of P(i,j) and its neighbors, and S, their standard deviation, each pixel P(i,j) is set to the mean value M if it falls inside the
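A simplified sketch of the nonlinear gradient idea: each interior pixel is replaced by the larger of its absolute deviations from the pixel above it and the pixel to its left. This is a variant chosen for clarity, not the exact IMAQ Vision formula:

```python
import numpy as np

def nonlinear_gradient(img):
    """Max of |P(i,j) - P(i-1,j)| and |P(i,j) - P(i,j-1)| for interior
    pixels; border pixels are left unchanged in this sketch."""
    img = img.astype(float)
    out = img.copy()
    up = np.abs(img[1:, 1:] - img[:-1, 1:])    # deviation from the pixel above
    left = np.abs(img[1:, 1:] - img[1:, :-1])  # deviation from the pixel to the left
    out[1:, 1:] = np.maximum(up, left)
    return out

img = np.array([[0, 0, 0],
                [0, 5, 0],
                [0, 0, 0]])
out = nonlinear_gradient(img)
print(out[1, 1], out[1, 2])  # 5.0 5.0: strong response on the intensity step
```

The max-of-deviations structure (rather than a linear sum) is what makes these filters nonlinear.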
Figure 1-2 illustrates how an IMAQ Vision image is represented in your system memory. In addition to the pixels of the image, the stored image includes additional rows and columns of pixels called the image border and the left and right alignments. Specific processing functions involving pixel neighborhood operations use image borders. The alignment regions ensure that the first pixel of the image is 8-byte aligned in memory. The size of the alignment blocks depends on the image width and border size. Aligning the image increases processing speed by as much as 30%.

The line width is the total number of pixels in a horizontal line of an image. The line width is the sum of the horizontal resolution, the image borders, and the left and right alignments. The horizontal resolution and line width may be the same length if the horizontal resolution is a multiple of 8 bytes and the border size is 0.

1. Image  2. Image Border  3. Vertical Resolution  4. Left Alignment  5. Horizontal Resolution  6. Right Alignment  7. Line Width

Figure 1-2. Internal Image Representation

Image Borders

Many image processing functions process a pixel by using the values of its neighbors. A neighbor is a pixel whose value affects the value of a nearby pixel when an image
Making Measurements on the Image

After you have located points in the image, you can make distance or geometrical measurements based on those points.

Distance Measurements

Make distance measurements using one of the following methods:
• Measure the distance between points found by one of the feature detection methods.
• Measure the distance between two edges of an object using the clamp functions available in IMAQ Vision.

Clamp functions measure the separation between two edges in a rectangular region using the rake function. First, the clamp functions detect points along the two edges using the rake function. They then compute the distance between the detected points and return the largest or smallest distance. Use the clamp functions to perform the following tasks:
• Find the smallest or largest horizontal separation between two vertically oriented edges.
• Find the smallest or largest vertical separation between two horizontally oriented edges.

Figure 13-9 illustrates how a clamp function finds the minimum distance between edges of an object.

1. Rectangular Search Region  2. Search Lines for Edge Detection  3. Detected Edge Points  4. Line Fit to Edge Points  5. Measured Distance

Figure 13-9. Clamp Function

Analytic Geometry

You can make the following geometrical measurements from the feature poin
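Once edge points are detected, the clamp reduces to taking the extreme of per-search-line separations; a sketch with a hypothetical helper, assuming the edge x-coordinates have already been found by a rake:

```python
import numpy as np

def min_horizontal_clamp(left_edge_xs, right_edge_xs):
    """Smallest horizontal separation between two vertical edges, given
    the x-coordinate detected on each horizontal search line."""
    gaps = (np.asarray(right_edge_xs, dtype=float)
            - np.asarray(left_edge_xs, dtype=float))
    return gaps.min()

# One left/right edge point per search line; the tightest line wins.
print(min_horizontal_clamp([10, 11, 10], [40, 38, 41]))  # 27.0
```

Returning `gaps.max()` instead gives the largest-separation variant of the clamp.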
256 RGB elements. A specific color is the result of applying a value between 0 and 255 for each of the three color components (red, green, and blue). If the red, green, and blue components have an identical value, the result is a gray-level pixel value.

A gray palette associates different shades of gray with each value so as to produce a linear and continuous gradation of gray, from black to white. You can set up the palette to assign the color black to the value 0 and white to 255, or vice versa. Other palettes can reflect linear or nonlinear gradations going from red to blue, light brown to dark brown, and so on. IMAQ Vision has five predefined color palettes. Each palette emphasizes different shades of gray.

In-Depth Discussion

The following sections introduce the five predefined palettes available in IMAQ Vision. The graphs in each section represent the color tables used by each palette. The horizontal axes of the graphs represent the input gray-level range (0 to 255), and the vertical axes represent the RGB intensities assigned to a given gray-level value.

Gray Palette

This palette has a gradual gray-level variation from black to white. Each value is assigned an equal amount of red, green, and blue in order to produce a gray level.

Temperature Palette

This
H = 256 · tan⁻¹(V2 / V1) / (2π)
S = 255 · (1 − 3 · min(R, G, B) / (R + G + B))

The following equations map the HSL color space to the RGB color space. With h = 2πH/256 and s = S/255, define

f(h) = (1/3) · (1 + s · cos(h) / cos(π/3 − h))

For 0 ≤ h < 2π/3:
b = (1 − s)/3,  r = f(h),  g = 1 − (r + b)

For 2π/3 ≤ h < 4π/3, with h′ = h − 2π/3:
r = (1 − s)/3,  g = f(h′),  b = 1 − (r + g)

For 4π/3 ≤ h < 2π, with h′ = h − 4π/3:
g = (1 − s)/3,  b = f(h′),  r = 1 − (g + b)

Finally, with l′ = 0.299r + 0.587g + 0.114b:
R = rL/l′,  G = gL/l′,  B = bL/l′

RGB and CIE XYZ

The following 3 × 3 matrix converts RGB to CIE XYZ:

X = 0.412453·R + 0.357580·G + 0.180423·B
Y = 0.212671·R + 0.715160·G + 0.072169·B
Z = 0.019334·R + 0.119193·G + 0.950227·B

By projecting the tristimulus values onto the unit plane X + Y + Z = 1, a color can be expressed in a 2D plane. The chromaticity coordinates are defined as follows:

x = X / (X + Y + Z)
y = Y / (X + Y + Z)
z = Z / (X + Y + Z)

You can obtain z from x and y by z = 1 − x − y. Hence, chromaticity coordinates are usually given as (x, y) only. The chromaticity values depend on the hue (dominant wavelength) and the saturation. Chromaticity values are independent of luminance. The diagram formed by (x, y) is referred to as the CIE 1931 chromaticity diagram, or the CIE x-y chromaticity diagram, as shown in Figure 1-10.
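The RGB-to-XYZ matrix and the chromaticity projection can be checked numerically; linear RGB components in [0, 1] are assumed:

```python
import numpy as np

# RGB -> CIE XYZ conversion matrix from the text
M = np.array([[0.412453, 0.357580, 0.180423],
              [0.212671, 0.715160, 0.072169],
              [0.019334, 0.119193, 0.950227]])

def chromaticity(rgb):
    """CIE x, y chromaticity coordinates of a linear RGB triple."""
    X, Y, Z = M @ np.asarray(rgb, dtype=float)
    s = X + Y + Z
    return X / s, Y / s  # z follows from z = 1 - x - y

x, y = chromaticity([1.0, 1.0, 1.0])  # white in this RGB space
print(round(x, 3), round(y, 3))       # 0.313 0.329
```

Note how any scaling of the RGB triple cancels in the projection: chromaticity depends on hue and saturation but not on luminance, as stated above.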
Grayscale Pattern Matching

IMAQ Vision grayscale pattern matching methods incorporate image understanding techniques to interpret the template information and use that information to find the template in the image. Image understanding refers to image processing techniques that generate information about the features of a template image. These methods include the following:
• Geometric modeling of images
• Efficient non-uniform sampling of images
• Extraction of rotation-independent template information

IMAQ Vision uses a combination of the edge information in the image and an intelligent image sampling technique to match patterns. The image edge content provides information about the structure of the image in a compact form. The intelligent sampling technique extracts points from the template that represent the overall content of the image. The edge information and the smart sampling method reduce the inherently redundant information in an image and improve the speed and accuracy of the pattern matching tool. In cases where the pattern can be rotated in the image, a similar technique is used, but with specially chosen template pixels whose values, or relative change in values, reflect the rotation of the pattern. The result is fast and accurate grayscale pattern matching. IMAQ Vision pattern matching accurately locates objects in conditions where they vary in size (±5%) and orientation
Connectivity-4: A pixel belongs to a particle if it is located a distance of D from another pixel in the particle. In other words, two pixels are considered to be part of the same particle if they are horizontally or vertically adjacent. They are considered part of two different particles if they are only diagonally adjacent. In Figure 9-10, the particle count equals 4.

Figure 9-10. Connectivity-4

Connectivity-8: A pixel belongs to a particle if it is located a distance of D or √2·D from another pixel in the particle. In other words, two pixels are considered part of the same particle if they are horizontally, vertically, or diagonally adjacent. In Figure 9-11, the particle count equals 1.

Figure 9-11. Connectivity-8

Primary Morphology Operations

When to Use

Primary morphological operations work on binary images to process each pixel based on its neighborhood. Each pixel is set either to 1 or 0, depending on its neighborhood information and the operation used. These operations always change the overall size and shape of particles in the image. Use the primary morphological operations for expanding or reducing particles, smoothing the borders of objects, finding the externa
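The connectivity-4 versus connectivity-8 particle counts can be demonstrated with SciPy's component labeling, passing the neighborhood as a structuring element:

```python
import numpy as np
from scipy import ndimage

img = np.array([[1, 0, 0],
                [0, 1, 0],
                [0, 0, 1]])  # three pixels touching only diagonally

four = np.array([[0, 1, 0],
                 [1, 1, 1],
                 [0, 1, 0]])            # connectivity-4 neighborhood
eight = np.ones((3, 3), dtype=int)      # connectivity-8 neighborhood

_, n4 = ndimage.label(img, structure=four)
_, n8 = ndimage.label(img, structure=eight)
print(n4, n8)  # 3 1: diagonal neighbors merge only under connectivity-8
```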
D is a dilation, O is an opening, C is a closing, F(I) is the image obtained after applying the function F to the image I, and G(F(I)) is the image obtained after applying the function F to the image I, followed by the function G.

Proper Closing

Concept and Mathematics

If I is the source image, the proper closing function extracts the maximum value of each pixel between the source image and its transformed image obtained after a closing, followed by an opening, followed by another closing:

proper closing(I) = max(I, C(O(C(I))))

or

proper closing(I) = max(I, EDDEED(I))

where I is the source image, E is an erosion, D is a dilation, O is an opening, C is a closing, F(I) is the image obtained after applying the function F to the image I, and G(F(I)) is the image obtained after applying the function F to the image I, followed by the function G.

Auto Median

Concept and Mathematics

If I is the source image, the auto median function extracts the minimum value of each pixel between the two images obtained by applying a proper opening and a proper closing to the source image:

auto median(I) = min(OCO(I), COC(I))

or

auto median(I) = min(DEEDDE(I), EDDEED(I))

where I is the source image, E is an erosion, D is a dilation, O is an opening, C is a closing, F(I) is the image obtained after applying the function F to the image I, and G(F(I)) is the image obtained after applying the f
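The proper closing composition can be sketched with SciPy's grayscale morphology; the structuring-element size is an arbitrary choice for illustration:

```python
import numpy as np
from scipy import ndimage

def proper_closing(img, size=3):
    """proper closing(I) = max(I, C(O(C(I)))), built from grayscale
    closing and opening with a size x size neighborhood."""
    coc = ndimage.grey_closing(
        ndimage.grey_opening(
            ndimage.grey_closing(img, size=size), size=size), size=size)
    return np.maximum(img, coc)

flat = np.full((5, 5), 7)
print((proper_closing(flat) == flat).all())  # True: a flat image is unchanged
```

The auto median follows the same pattern, taking the pixelwise minimum of the OCO and COC compositions.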
Figure 12-8 shows how binary shape matching is used to sort windshield wiper parts into different shapes. Figure 12-8a shows the shape template. Figure 12-8b shows the original grayscale image of different windshield parts. Figure 12-8c shows the binary (thresholded) version of the original image. Figure 12-8d shows the output of the shape matching function. The shape matching function finds the shape you want in the image, regardless of variations in size and orientation.

Figure 12-8. Using Shape Matching to Search for Windshield Wiper Parts

Dimensional Measurements

This chapter contains information about coordinate systems, analytic tools, and clamps.

Introduction

You can use dimensional measurement, or gauging, tools in IMAQ Vision to obtain quantifiable, critical distance measurements, such as distances, angles, areas, line fits, circular fits, and counts, to determine if a certain product was manufactured correctly. Components such as connectors, switches, and relays are small and manufactured in high quantity. Human inspection of these components is tedious, time consuming, and inconsistent. IMAQ Vision can quickly and consistently measure certain features on these components and generate a report of the results. If the gauged distance or count does not fall within user-specified tolerance limits, the component or part fails to meet production specifications
• Max Gray Value: Maximum intensity value, in gray-level units
• Mean Gray Value: Mean intensity value in the particle, expressed in gray-level units
• Standard Deviation: Standard deviation of the intensity values

Image Processing

This chapter contains information about lookup tables, convolution kernels, spatial filters, and grayscale morphology.

Lookup Tables

Lookup table (LUT) transformations are basic image processing functions that highlight details in areas containing significant information, at the expense of other areas. These functions include histogram equalization, gamma corrections, logarithmic corrections, and exponential corrections.

When to Use

Use LUT transformations to improve the contrast and brightness of an image by modifying the dynamic intensity of regions with poor contrast.

LUT Transformation Concepts

A LUT transformation converts input gray-level values (those from the source image) into other gray-level values in the transformed image. A LUT transformation applies the transform T(x) over a specified input range [rangeMin, rangeMax] in the following manner:

T(x) = dynamicMin, if x < rangeMin
T(x) = f(x), if rangeMin ≤ x ≤ rangeMax
T(x) = dynamicMax, if x > rangeMax

where x represents the input gray-level value, dynamicMin = 0 (8-bit images) or the smallest initial pixel value (16-bit and floating-point images), and dynamicMax = 255 (8-bit images) or the largest initial pixel value (16-bi
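The piecewise definition of T(x) maps naturally onto a 256-entry table applied by indexing; a sketch with an assumed transform f(x) = x/2:

```python
import numpy as np

def apply_lut(image, f, range_min, range_max, dyn_min=0, dyn_max=255):
    """Build the piecewise LUT T(x) and apply it by indexing:
    dyn_min below range_min, f(x) inside the range, dyn_max above."""
    x = np.arange(256)
    lut = np.where(x < range_min, dyn_min,
          np.where(x > range_max, dyn_max, f(x))).astype(np.uint8)
    return lut[image]

img = np.array([[0, 100, 255]], dtype=np.uint8)
out = apply_lut(img, lambda x: x // 2, range_min=50, range_max=200)
print(out)  # [[  0  50 255]]
```

Building the table once and indexing is why LUT transforms cost the same regardless of how complicated f(x) is.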
IMAQ Vision Concepts Manual

National Instruments
June 2003 Edition
Part Number 322916B-01

Worldwide Technical Support and Product Information: ni.com

National Instruments Corporate Headquarters
11500 North Mopac Expressway, Austin, Texas 78759-3504 USA
Tel: 512 683 0100

Worldwide Offices
Australia 1800 300 800, Austria 43 0 662 45 79 90 0, Belgium 32 0 2 757 00 20, Brazil 55 11 3262 3599, Canada (Calgary) 403 274 9391, Canada (Montreal) 514 288 5722, Canada (Ottawa) 613 233 5949, Canada (Québec) 514 694 8521, Canada (Toronto) 905 785 0085, Canada (Vancouver) 514 685 7530, China 86 21 6555 7838, Czech Republic 420 2 2423 5774, Denmark 45 45 76 26 00, Finland 385 0 9 725 725 11, France 33 0 1 48 14 24 24, Germany 49 0 89 741 31 30, Greece 30 2 10 42 96 427, India 91 80 51190000, Israel 972 0 3 6393737, Italy 39 02 413091, Japan 81 3 5472 2970, Korea 82 02 3451 3400, Malaysia 603 9131 0918, Mexico 001 800 010 0793, Netherlands 31 0 348 433 466, New Zealand 1800 300 800, Norway 47 0 66 90 76 60, Poland 48 0 22 3390 150, Portugal 351 210 311 210, Russia 7 095 238 7139, Singapore 65 6226 5886, Slovenia 386 3 425 4200, South Africa 27 0 11 805 8197, Spain 34 91 640 0085, Sweden 46 0 8 587 895 00, Switzerland 41 56 200 51 51, Taiwan 886 2 2528 7227, Thailand 662 992 7519, United Kingdom 44 0 1635 523545

For further support information, refer to the Technical Support and Professional Services appendix. To comment on the documentatio
Given the following source image, a smoothing filter produces the following image.

Kernel Definition

A smoothing convolution filter is an averaging filter whose kernel contains only the positive coefficients a, b, c, and d, with a central coefficient x equal to 0 or 1. Because all the coefficients in a smoothing kernel are positive, each central pixel becomes a weighted average of its neighbors. The stronger the weight of a neighboring pixel, the more influence it has on the new value of the central pixel. For a given set of coefficients (a, b, c, d), a smoothing kernel with a central coefficient equal to 0 (x = 0) has a stronger blurring effect than a smoothing kernel with a central coefficient equal to 1 (x = 1). Notice the following smoothing kernels and filtered images. A larger kernel size corresponds to a stronger smoothing effect.

Kernel A (3 × 3):
0 1 0
1 0 1
0 1 0

Kernel B (3 × 3):
2 2 2
2 1 2
2 2 2

Kernel C (5 × 5), all coefficients equal to 1:
1 1 1 1 1
1 1 1 1 1
1 1 1 1 1
1 1 1 1 1
1 1 1 1 1

Kernel D (7 × 7): all 49 coefficients equal to 1.

Gaussian Filters

A Gaussian filter attenuates the variations of light intensi
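Kernel A above, normalized by the sum of its coefficients, acts as an averaging filter; a quick check with SciPy on a single bright pixel:

```python
import numpy as np
from scipy import ndimage

kernel = np.array([[0., 1., 0.],
                   [1., 0., 1.],
                   [0., 1., 0.]])
kernel /= kernel.sum()  # normalize so the output stays an average

img = np.zeros((5, 5))
img[2, 2] = 100.0
smoothed = ndimage.convolve(img, kernel, mode='constant')
print(smoothed[2, 2], smoothed[2, 1])  # 0.0 25.0
```

Because the central coefficient is 0, the bright pixel's own value is averaged away entirely while each 4-neighbor receives a quarter of it, illustrating the stronger blurring of x = 0 kernels.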
IMAQ Vision to find vertically or horizontally oriented lines. These functions use the rake and concentric rake functions to find a set of points along the edge of an object and then fit a line through the edge. Refer to Chapter 11, Edge Detection, for more information about the rake and concentric rake functions. The line fitting method is described later in this chapter. Figure 13-6 illustrates how a rake finds a straight edge.

1. Search Region  2. Search Lines  3. Detected Edge Points  4. Line Fit to Edge Points

Figure 13-6. Finding a Straight Feature

Use the circle detection function to locate circular edges. This function uses a spoke to find points on a circular edge and then fits a circle to the detected points. Figure 13-7 illustrates how a spoke finds circular edges.

1. Annular Search Region  2. Search Lines  3. Detected Edge Points  4. Circle Fit to Edge Points

Figure 13-7. Circle Detection

Shape-Based Features

Use pattern matching or color pattern matching to find features that are better described by their shape and grayscale or color content than by the boundaries of the part.

Figure 13-8. Finding Shape Features
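Fitting a circle to detected edge points can be sketched with an algebraic least-squares (Kåsa) fit; this generic method stands in for the NI fitting routine, whose exact algorithm is not specified here:

```python
import numpy as np

def fit_circle(xs, ys):
    """Least-squares circle fit: solve x^2 + y^2 = 2*cx*x + 2*cy*y + c,
    where c = r^2 - cx^2 - cy^2, as a linear system in (cx, cy, c)."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r

# Edge points on a circle of radius 5 centered at (1, 2)
t = np.linspace(0, 2 * np.pi, 12, endpoint=False)
cx, cy, r = fit_circle(1 + 5 * np.cos(t), 2 + 5 * np.sin(t))
print(round(cx, 3), round(cy, 3), round(r, 3))  # 1.0 2.0 5.0
```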
This set of equations can be expressed so that the sum Ra + Rb and the product Ra × Rb become functions of the parameters Particle Area and Particle Perimeter. Ra and Rb then become the two solutions of the following polynomial equation:

2x² − Px + 2A = 0

Notice that for a given area and perimeter, only one solution (Ra, Rb) exists.

• Equivalent Rect Short Side: Length of the short side of the rectangle that has the same area and same perimeter as the particle. This length is equal to Rb.
• Equivalent Rect Diagonal: Distance between opposite corners of the equivalent rectangle, √(Ra² + Rb²).
• Rectangle Ratio: Ratio of the long side of the equivalent rectangle to its short side, which is defined as

rectangle long side / rectangle short side = Ra / Rb

The more elongated the equivalent rectangle, the higher the rectangle ratio. The closer the equivalent rectangle is to a square, the closer to 1 the rectangle ratio.

Hydraulic Radius

A disk with radius R has a hydraulic radius equal to

disk area / disk perimeter = πR² / (2πR) = R / 2

Areas

Table 10-4 lists the IMAQ Vision particle area measurements.

Table 10-4. Areas

• Area: Area of the particle (A)
• Holes Area: Sum of the areas of each hole in the particle (AH)
• Particle & Holes Area: Area of a particle that completely covers the
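The quadratic 2x² − Px + 2A = 0 can be solved directly for the equivalent rectangle's sides; the root sum P/2 and product A match the perimeter and area constraints:

```python
import numpy as np

def equivalent_rect_sides(area, perimeter):
    """Sides (Ra, Rb) of the rectangle with the given area and perimeter:
    the two roots of 2x^2 - Px + 2A = 0, long side first."""
    ra, rb = sorted(np.roots([2.0, -perimeter, 2.0 * area]).real,
                    reverse=True)
    return ra, rb

ra, rb = equivalent_rect_sides(area=12.0, perimeter=14.0)
print(round(ra, 6), round(rb, 6))  # 4.0 3.0: the 3 x 4 rectangle
```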
97. User or Calibration ROI: corrects pixels in both the user-defined ROI and the calibration ROI. User ROI: corrects only the pixels inside the user-defined ROI specified during the learn calibration phase. User and Calibration ROI: corrects only the pixels that lie in the intersection of the user-defined ROI and the calibration ROI. Calibration ROI: corrects only the pixels inside the calibration ROI; the calibration ROI is computed by the calibration algorithm. The valid coordinate indicates whether the pixel coordinate you are trying to map to a real-world coordinate lies within the image region you corrected. For example, if you corrected only the pixels within the calibration ROI but you try to map a pixel outside the calibration ROI to real-world coordinates, the Corrected Image Learn ROI parameter indicates an error. Simple Calibration. When your camera axis is perpendicular to the image plane and lens distortion is negligible, you can use simple calibration to calibrate your imaging setup. In simple calibration, a pixel coordinate is transformed to a real-world coordinate through scaling in the x (horizontal) and y (vertical) directions. Simple calibration maps pixel coordinates to real-world coordinates directly, without a calibration grid. The software rotates and scales a pixel coordinate according to
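Simple calibration can be sketched as a rotation about the coordinate-system origin followed by independent x and y scaling. This is an illustrative approximation, not the exact IMAQ Vision transform; the axis-direction convention in particular is an assumption here.

```python
import math

def pixel_to_real(px, py, x_scale, y_scale, origin=(0.0, 0.0), angle_deg=0.0):
    """Hedged sketch of simple calibration: translate a pixel coordinate
    to the user-defined origin, rotate it by the coordinate system's
    angle, then scale x and y independently into real-world units."""
    dx, dy = px - origin[0], py - origin[1]
    a = math.radians(angle_deg)
    rx = dx * math.cos(a) + dy * math.sin(a)
    ry = -dx * math.sin(a) + dy * math.cos(a)
    return rx * x_scale, ry * y_scale
```

With no rotation, pixel (10, 5) at 0.5 units/pixel maps to (5.0, 2.5).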
98. …ON ERRORS, SOFTWARE AND HARDWARE COMPATIBILITY PROBLEMS, MALFUNCTIONS OR FAILURES OF ELECTRONIC MONITORING OR CONTROL DEVICES, TRANSIENT FAILURES OF ELECTRONIC SYSTEMS (HARDWARE AND/OR SOFTWARE), UNANTICIPATED USES OR MISUSES, OR ERRORS ON THE PART OF THE USER OR APPLICATIONS DESIGNER (ADVERSE FACTORS SUCH AS THESE ARE HEREAFTER COLLECTIVELY TERMED "SYSTEM FAILURES"). ANY APPLICATION WHERE A SYSTEM FAILURE WOULD CREATE A RISK OF HARM TO PROPERTY OR PERSONS (INCLUDING THE RISK OF BODILY INJURY AND DEATH) SHOULD NOT BE RELIANT SOLELY UPON ONE FORM OF ELECTRONIC SYSTEM DUE TO THE RISK OF SYSTEM FAILURE. TO AVOID DAMAGE, INJURY, OR DEATH, THE USER OR APPLICATION DESIGNER MUST TAKE REASONABLY PRUDENT STEPS TO PROTECT AGAINST SYSTEM FAILURES, INCLUDING BUT NOT LIMITED TO BACK-UP OR SHUT-DOWN MECHANISMS. BECAUSE EACH END-USER SYSTEM IS CUSTOMIZED AND DIFFERS FROM NATIONAL INSTRUMENTS TESTING PLATFORMS, AND BECAUSE A USER OR APPLICATION DESIGNER MAY USE NATIONAL INSTRUMENTS PRODUCTS IN COMBINATION WITH OTHER PRODUCTS IN A MANNER NOT EVALUATED OR CONTEMPLATED BY NATIONAL INSTRUMENTS, THE USER OR APPLICATION DESIGNER IS ULTIMATELY RESPONSIBLE FOR VERIFYING AND VALIDATING THE SUITABILITY OF NATIONAL INSTRUMENTS PRODUCTS WHENEVER NATIONAL INSTRUMENTS PRODUCTS ARE INCORPORATED IN A SYSTEM OR APPLICATION, INCLUDING, WITHOUT LIMITATION, THE APPROPRIATE DESIGN, PROCESS, AND SAFETY LEVEL OF SUCH SYSTEM OR APPLICATION. Contents. About This Manual: Conventions
99. After lowpass truncation with fc = f0 + (fmax - f0)/20, spatial frequencies outside the truncation range [f0, fc] are removed. The part of the central peak that remains is identical to the one in the original FFT plane. Highpass FFT Filters. A highpass FFT filter attenuates or removes low frequencies present in the FFT plane. It has the effect of suppressing information related to slow variations of light intensities in the spatial image. In this case, the inverse FFT command produces an image in which overall patterns are attenuated and details are emphasized. Highpass Attenuation. Highpass attenuation applies a linear attenuation to the full frequency range, increasing from the null frequency f0 to the maximum frequency fmax. This is done by multiplying each frequency by a coefficient C, which is a function of its deviation from the fundamental and maximum frequencies: C(f) = (f - f0)/(fmax - f0), where C(f0) = 0 and C(fmax) = 1. Highpass Truncation. Highpass truncation removes a frequency f if it is lower than the cutoff, or truncation, frequency fc. This is done by multiplying each frequency f by a coefficient C equal to 1 or 0, depending on whether the frequency f is greater than the truncation frequency fc: if f < fc, then C(f) = 0; otherwise C(f) = 1.
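Highpass truncation is straightforward to implement on a centered FFT array. The sketch below (plain NumPy, not NI code) zeroes every coefficient whose radial frequency falls below the cutoff fc:

```python
import numpy as np

def highpass_truncate(fft_img, f_cut):
    """Highpass truncation: zero every FFT coefficient whose distance
    from the DC term is below the cutoff frequency f_cut.

    Assumes a centered spectrum (np.fft.fftshift already applied), so
    the null frequency sits at the array's center.
    """
    h, w = fft_img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    f = np.hypot(yy - h // 2, xx - w // 2)   # radial frequency index
    out = fft_img.copy()
    out[f < f_cut] = 0                        # C(f) = 0 for f < fc
    return out
```

Coefficients at or beyond the cutoff are left untouched (C(f) = 1), exactly mirroring the lowpass case.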
100. YIQ color space, 1-19
coordinate system definitions, 1-1
image borders, 1-8
image file formats, 1-5
image masks, 1-10
image types
    complex images, 1-5
    grayscale images, 1-4
internal representation of IMAQ Vision image, 1-6
properties
    image definition, 1-2
    image resolution, 1-2
    number of planes, 1-3
overview, 1-2
digital particles
    analysis concepts, III-2
    angles, 10-14
    areas, 10-13
    coordinates, 10-7
    factors, 10-16
    lengths, 10-9
    measurement concepts, 10-3
    moments, 10-18
    pixel versus real-world measurements, 10-1
    quantities, 10-14
    ratios, 10-16
dimensional measurements
    edge-based functions, 13-5
    pattern matching-based functions, 13-7
    steps for defining, 13-4
    when to use, 13-3
    finding features or measurement points
        edge-based features, 13-9
        line and circular features, 13-9
        overview, 13-2
        shape-based features, 13-11
    finding part of image in region of interest, 13-2
    making measurements
        analytic geometry, 13-13
        distance measurements, 13-12
        line fitting, 13-13
        overview, 13-2
    overview, 13-1
    qualifying measurements, 13-3
    when to use, 13-1
direction gradient filter, 5-17
display
    image display
        basic concepts, 2-1
        display modes, 2-2
        mapping methods for 16-bit image display, 2-2
        when to use, 2-1
    nondestructive overlay, 2-10
    palettes
        basic concepts, 2-4
        Binary palette, 2-7
        Gradient palette, 2-7
        Gray palette, 2-5
        Rainbow palette, 2-6
        Temperature palette, 2-6
        when to use, 2-4
    regions of interest
101. …above but superimposes them over the source image. The transformed image looks like the source image, with edges highlighted. Use this type of kernel for grain extraction and perception of texture. Source image, Prewitt #15 kernel, filtered image:
-1 -1  0
-1  1  1
 0  1  1
Notice that Prewitt #15 can be decomposed as follows:
-1 -1  0     -1 -1  0     0 0 0
-1  1  1  =  -1  0  1  +  0 1 0
 0  1  1      0  1  1     0 0 0
Note: The convolution filter using the second kernel on the right side of the equation reproduces the source image; all neighboring pixels are multiplied by 0, and the central pixel remains equal to itself (P(i,j) = 1 x P(i,j)). This equation indicates that Prewitt #15 adds the edges extracted by the kernel Prewitt #14 to the source image: Prewitt #15 = Prewitt #14 + source image. Edge Thickness. The larger the kernel, the thicker the edges. The following image illustrates a gradient west-east 3 x 3; finally, the following image illustrates a gradient west-east 7 x 7. Laplacian Filters. A Laplacian filter highlights the variation of the light intensity surrounding a pixel. The filter extracts the contour of objects and outlines details. Unlike the gradient filter, it is omnidirectional. Given the following source image, a Laplacian filter extracts contours to produce the following image.
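A Laplacian filter can be demonstrated with a small convolution. The 3 x 3 kernel below is one common omnidirectional Laplacian; the coefficients of IMAQ Vision's predefined Laplacian kernels may differ.

```python
import numpy as np

# One common omnidirectional Laplacian kernel (an assumption here;
# the manual's predefined kernels may use other coefficients).
LAPLACIAN = np.array([[-1, -1, -1],
                      [-1,  8, -1],
                      [-1, -1, -1]])

def apply_kernel(img, kernel):
    """Valid-mode 3x3 convolution, enough to demonstrate contour extraction."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    k = kernel[::-1, ::-1]  # true convolution flips the kernel
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return out
```

Because the kernel's coefficients sum to zero, flat regions produce no response, while the contour of an object produces a strong one in every direction.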
102. …age. Do not use this function when the central coefficient of the structuring element is equal to 1. In such cases, the hit-miss function can turn only certain particle pixels from 1 to 0; however, the addition of the thickening function resets these pixels to 1. If I is an image, thickening(I) = I OR hit-miss(I). Figure 9-20a represents the binary source file used in the following thickening example. Figure 9-20b shows the result of the thickening function applied to the source image, which filled single-pixel holes using the following structuring element:
1 1 1
1 0 1
1 1 1
(Figure 9-20, Thickening Function.) Figure 9-21a shows the source image for another thickening example. Figures 9-21b through 9-21d illustrate the results of three thickenings applied to the source image. Each thickening uses a different structuring element, which is specified on top of each transformed image; gray cells indicate pixels equal to 1. (Figure 9-21, Thickening Function with Different Structuring Elements.) Proper-Opening Function. The proper-opening function is a finite and dual combination of openings and closings. It removes small particles and smooths the contour of particles according to the template d
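The identity thickening(I) = I OR hit-miss(I) can be exercised directly with a small NumPy sketch (a naive implementation for illustration, not NI code). Using the hole-filling structuring element from Figure 9-20 turns a single-pixel hole back on:

```python
import numpy as np

def hit_miss(img, se):
    """Hit-miss: a pixel is set when its 3x3 neighborhood matches the
    structuring element exactly (1s must be 1, 0s must be 0).
    Border pixels are left unset for simplicity."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if np.array_equal(img[y-1:y+2, x-1:x+2], se):
                out[y, x] = 1
    return out

def thicken(img, se):
    """Thickening(I) = I OR hit-miss(I), as defined in the text."""
    return img | hit_miss(img, se)
```

Applied to an all-ones image with one zero pixel, the hole-filling structuring element matches exactly at the hole, and the OR restores it to 1.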
103. …age or regions in the image to the learned color spectrum, calculating a score for each region. This score relates how closely the color information in the image region matches the information represented by the color spectrum. To use color matching, you need to know the location of the objects in the image before performing the match. Color location functions extend the capabilities of color matching to applications where you do not know the location of the objects in the image. Color location uses the color information from a template image to look for occurrences of the template in the search image. The basic operation moves the template across the image pixel by pixel and compares the color information at the current location in the image to the color information in the template, using the color matching algorithm. Because searching an entire image for color matches is time consuming, the color location software uses some techniques to speed up the location process. A coarse-to-fine search strategy finds the rough locations of the matches in the image. A more refined search, using a hill-climbing algorithm, is then performed around each match to get the accurate location of the match. Color location is an efficient way to look for occurrences of regions in an image with specific color attributes. Refer to the Color Matching and Color Location sections for more information.
104. …age used in this example. Figure 9-30b shows the results after the convex hull function is applied to the image (Figure 9-30, Convex Hull Function). Particle Measurements. Introduction. This chapter contains information about characterizing particles in a binary image. When to Use. A particle is a group of contiguous nonzero pixels in an image. Particles can be characterized by measures relating to attributes such as particle location, area, and shape. Use particle measurements when you want to make shape measurements on particles in a binary image. Pixel versus Real-World Measurements. In addition to making conventional pixel measurements, IMAQ Vision particle analysis functions can use calibration information attached to an image to make measurements in calibrated real-world units. In applications that do not require the display of corrected images, you can use the calibration information attached to the image to make real-world measurements directly, without using time-consuming image correction. In pixel measurements, a pixel is considered to have an area of one square unit located entirely at the center of the pixel. In calibrated measurements, a pixel is a polygon with corners defined as plus or minus one-half a unit from the center of the pixel. Figure 10-1 illustrates this concept.
105. …alent Rect Short Side (Feret), digital particles, 10-10
equivalent rectangle measurements, digital particles, 10-12
erosion function
    binary morphology
        basic concepts, 9-10
        examples, 9-11
        structure element effects (table), 9-12
    grayscale morphology
        concept and mathematics, 5-39
        examples, 5-36
        purpose and use, 5-36
error map, output of calibration function, 3-12
example code, B-1
exponential and gamma correction
    basic concepts, 5-6
    examples, 5-7
    summary table, 5-3
external edge function, binary morphology, 9-14
F
factors, digital particles, 10-16
Fast Fourier Transform (FFT). See also frequency filters.
    definition, 7-1
    FFT display, 7-12
    FFT representation
        optical representation, 7-4
        standard representation, 7-3
    Fourier transform concepts, 7-11
feature, in pattern matching, 12-1
fiducials
    definition, 12-1
    example of common fiducial (figure), 12-2
field of view
    definition, 3-3
    relationship between pixel resolution and field of view (figure), 3-4
filters. See convolution kernels; frequency filters; spatial filters.
Fourier transform, 7-11. See also Fast Fourier Transform (FFT).
frame. See pixel frame shape.
frequency filters
    definition, 7-1
    Fast Fourier Transform concepts
        FFT display, 7-12
        FFT representation, 7-3
        Fourier transform, 7-11
        overview, 7-3
    FFT representation
        optical representation, 7-4
        standard representation, 7-3
    highpass FFT filter
106. …algorithms use these values to estimate the location of an edge with subpixel accuracy. Figure 11-10, Obtaining Subpixel Information Using Interpolation, plots gray level versus pixels: 1) known pixel value, 2) interpolating function, 3) interpolated value, 4) subpixel location. With the imaging system components and software tools available today, you can reliably estimate one-fourth subpixel accuracy. However, the results of the estimation depend heavily on the imaging setup, such as lighting conditions and the camera lens. Before resorting to subpixel information, try to improve the image resolution. For more information about improving your images, see your IMAQ Vision user manual. Extending Edge Detection to Two-Dimensional Search Regions. The edge detection tool in IMAQ Vision works on a one-dimensional profile. You can extend the use of the edge detection tool to some two-dimensional search areas with one of the following tools: rake, spoke, and concentric rake. In these edge detection variations, the two-dimensional search area is covered by a number of search lines over which the edge detection is performed. You can control the number of search lines used in the search region by defining the separation between the lines. Rake. A rake works on a rectangular search region. Th
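A common way to obtain subpixel edge locations, in the spirit of Figure 11-10, is to fit a parabola through three neighboring samples of the profile (or of its gradient) and take the parabola's extremum. This is an illustrative method; the interpolating function IMAQ Vision actually uses may differ.

```python
def subpixel_peak(y_minus, y0, y_plus):
    """Fit a parabola through three neighboring profile samples and
    return the extremum's offset (between -0.5 and +0.5 pixels)
    relative to the center sample -- a standard subpixel refinement."""
    denom = y_minus - 2.0 * y0 + y_plus
    if denom == 0:
        return 0.0  # samples are collinear: no curvature to exploit
    return 0.5 * (y_minus - y_plus) / denom
```

For samples of a parabola peaked a quarter pixel to the right of the center sample, the function returns exactly 0.25.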
107. …alue k, and the vertical axis of the cumulative histogram indicates the percentage of pixels set to a value less than or equal to k. Linear Histogram. The density function is H_linear(k) = n_k, where n_k is the number of pixels equal to k. The probability function is P_linear(k) = n_k / n, where n is the total number of pixels and P_linear(k) is the probability that a pixel is equal to k. (Figure 4-2, Sample of a Linear Histogram.) Cumulative Histogram. The distribution function is H_cumul(k) = sum from i = 0 to k of n_i, where H_cumul(k) is the number of pixels that are less than or equal to k. The probability function is P_cumul(k) = (sum from i = 0 to k of n_i) / n, where P_cumul(k) is the probability that a pixel is less than or equal to k. (Figure 4-3, Sample of a Cumulative Histogram.) Interpretation. The gray-level intervals featuring a concentrated set of pixels reveal the presence of significant components in the image and their respective intensity ranges. In Figure 4-2, the linear histogram reveals that the image is composed of three major elements. The cumulative histogram of the same image, in Figure 4-3, shows that the two left-most peaks compose approximately 80% of the image, while the remaining 20% corresponds to the third peak. Histogram Scale. The vertical axis of a histogram plot can
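The four histogram functions defined above map directly onto a few NumPy operations:

```python
import numpy as np

def histograms(img, levels=256):
    """Linear density n_k, linear probability P(k), cumulative
    distribution H_cumul(k), and cumulative probability P_cumul(k)
    of a grayscale image, as defined in the text."""
    n = np.bincount(img.ravel(), minlength=levels).astype(float)  # n_k
    total = n.sum()
    p = n / total        # P_linear(k): probability a pixel equals k
    H = np.cumsum(n)     # H_cumul(k): number of pixels <= k
    P_cum = H / total    # P_cumul(k): probability a pixel is <= k
    return n, p, H, P_cum
```

As the text notes, P_cumul always rises monotonically to 1 at the maximum gray level.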
108. …an be transformed into a spatial image f(x, y) of resolution N x M using the following formula: f(x, y) = (1/(N M)) sum from u = 0 to N-1, sum from v = 0 to M-1, of F(u, v) e^(j 2 pi (ux/N + vy/M)). In the discrete domain, the Fourier transform is calculated with an efficient algorithm called the Fast Fourier Transform (FFT): F(u, v) = sum from x = 0 to N-1, sum from y = 0 to M-1, of f(x, y) e^(-j 2 pi (ux/N + vy/M)), where N x M is the resolution of the spatial image f(x, y). Because e^(j 2 pi u x) = cos(2 pi u x) + j sin(2 pi u x), F(u, v) is composed of an infinite sum of sine and cosine terms. Each pair (u, v) determines the frequency of its corresponding sine and cosine pair. For a given set (u, v), notice that all values f(x, y) contribute to F(u, v); because of this complexity, the FFT calculation is time consuming. Given an image with a resolution N x M and given the spatial step increments dx and dy, the FFT of the source image has the same resolution N x M, and its frequency step increments du and dv are defined by the following equations: du = 1/(N dx) and dv = 1/(M dy). FFT Display. An FFT image can be visualized using any of its four complex components: real part, imaginary part, magnitude, and phase. The relation between these components is expressed by F(u, v) = R(u, v) + j I(u, v), where R(u, v) is the real part and I(u, v) is the imaginary part, and F(u, v)
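The same quantities can be computed with NumPy's FFT routines; the spatial steps dx and dy below are assumed values used only for illustration:

```python
import numpy as np

# Frequency step increments for an N x M image with spatial steps dx, dy:
#   du = 1/(N*dx),  dv = 1/(M*dy)
img = np.random.default_rng(0).random((64, 32))
F = np.fft.fft2(img)             # F(u, v), complex-valued
dx = dy = 0.1                    # assumed spatial step, e.g. mm/pixel
du = 1.0 / (img.shape[0] * dx)
dv = 1.0 / (img.shape[1] * dy)
back = np.fft.ifft2(F).real      # inverse transform recovers f(x, y)
```

Round-tripping through `fft2` and `ifft2` reproduces the source image to floating-point precision, matching the forward/inverse pair above.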
109. …an example of a set of points located by the rake function. As shown in the figure, a typical line-fitting algorithm that uses all of the points to fit a line returns inaccurate results. The line-fitting function in IMAQ Vision compensates for outlying points in the dataset and returns a more accurate result. IMAQ Vision uses the following process to fit a line. IMAQ Vision assumes that a point is part of a line if the point lies within a user-defined distance, or pixel radius, from the fitted line. The line-fitting algorithm then fits a line to a subset of points that fall along an almost straight line. IMAQ Vision determines the quality of the line fit by measuring its mean square distance (MSD), which is the average of the squared distances between each point and the estimated line. Figure 13-11 shows how the MSD is calculated. Next, the line-fitting function removes the subset of points from the original set. IMAQ Vision repeats these steps until all points have been fit. The line-fitting algorithm then finds the line with the lowest MSD, which corresponds to the line with the best quality. The function then improves the quality of the line by successively removing the points furthest from the line until a user-defined minimum score is obtained or a user-specified maximum number of iterations is exceeded. The result of the line-fitting func
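The outlier-tolerant fitting loop described above can be sketched as follows. This is a simplified illustration: the actual IMAQ Vision algorithm also searches point subsets and applies a minimum score, which this sketch omits.

```python
import numpy as np

def fit_line_robust(points, pixel_radius=2.0, iterations=5):
    """Fit a line, keep only points within pixel_radius of it, and
    refit -- a sketch of the outlier rejection described in the text.
    Returns slope, intercept, and the mean square distance (MSD)
    of the inliers."""
    pts = np.asarray(points, dtype=float)
    keep = np.ones(len(pts), dtype=bool)
    for _ in range(iterations):
        m, b = np.polyfit(pts[keep, 0], pts[keep, 1], 1)
        # perpendicular distance of every point from the line y = m*x + b
        d = np.abs(m * pts[:, 0] - pts[:, 1] + b) / np.hypot(m, 1.0)
        keep = d <= pixel_radius
    msd = float(np.mean(d[keep] ** 2))   # average squared distance of inliers
    return m, b, msd
```

On four collinear points plus one gross outlier, the refit discards the outlier and recovers the underlying line with an MSD near zero.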
110. …and should be rejected. When to Use. Use gauging for applications in which inspection decisions are made on critical dimensional information obtained from images of the part. Gauging is often used in both inline and offline production. During inline processes, each component is inspected as it is manufactured. Inline gauging inspection is often used in mechanical assembly verification, electronic packaging inspection, container inspection, glass vial inspection, and electronic connector inspection. You also can use gauging to measure the quality of products offline. A sample of products is extracted from the production line. Then, using measured distances between features on the object, IMAQ Vision determines if the sample falls within a tolerance range. Gauging techniques also allow you to measure the distance between particles and edges in binary images and easily quantify image measurements. Dimensional Measurements Concepts. The gauging process consists of the following four steps: 1) locate the component or part in the image, 2) locate features in different areas of the part, 3) make measurements using these features, and 4) compare the measurements to specifications to determine if the part passes inspection. Locating the Part in the Image. A typical gauging application extracts measurements f
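Steps 3 and 4 of the gauging process reduce to measuring distances between located features and comparing them against a specification. A minimal sketch with hypothetical helper names (these are not NI functions):

```python
import math

def distance(p1, p2):
    """Euclidean distance between two located feature points, in pixels
    or in real-world units if the coordinates are already calibrated."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def passes_inspection(measured, nominal, tolerance):
    """Step 4 of the gauging process: the part passes when the measured
    value lies within nominal +/- tolerance."""
    return abs(measured - nominal) <= tolerance
```

For example, a measured 5.0 mm gap passes a 5.02 +/- 0.05 mm specification but fails a 5.2 +/- 0.05 mm one.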
111. …and then make selected measurements of those regions. These regions are commonly referred to as particles. A particle is a contiguous region of nonzero pixels. You can extract particles from a grayscale image by thresholding the image into background and foreground states: zero-valued pixels are in the background state, and all nonzero-valued pixels are in the foreground. Particle analysis consists of a series of processing operations and analysis functions that produce information about particles in an image. Using particle analysis, you can detect and analyze any two-dimensional shape in an image. When to Use. Use particle analysis when you are interested in finding particles whose spatial characteristics satisfy certain criteria. In many applications where computation is time consuming, you can use particle filtering to eliminate particles that are of no interest based on their spatial characteristics and keep only the relevant particles for further analysis. You can use particle analysis to find statistical information such as the size, number, location, and presence of particles. This information allows you to perform many machine vision inspection tasks, such as detecting flaws on silicon wafers or soldering defects on electronic boards, as well as web inspection applications such as finding structural defects on wood
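Extracting particles, contiguous regions of nonzero pixels, is a connected-component labeling problem. A naive 4-connected labeling sketch (for illustration only; production code would use an optimized library routine):

```python
import numpy as np
from collections import deque

def label_particles(binary):
    """Label 4-connected particles (contiguous nonzero pixels) with a
    simple breadth-first search, returning the label image and the
    particle count."""
    labels = np.zeros(binary.shape, dtype=int)
    current = 0
    for y, x in zip(*np.nonzero(binary)):
        if labels[y, x]:
            continue            # pixel already belongs to a particle
        current += 1
        queue = deque([(y, x)])
        labels[y, x] = current
        while queue:
            cy, cx = queue.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    queue.append((ny, nx))
    return labels, current
```

Once labeled, per-particle measurements (area, location, shape) reduce to statistics over each label's pixels.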
112. …ange your working distance, modify the distance so that you can select a lens with the appropriate focal length and minimize distortion. Contrast. Resolution and contrast are closely related factors contributing to image quality. Contrast defines the differences in intensity values between the object under inspection and the background. Your imaging system should have enough contrast to distinguish objects from the background. Proper lighting techniques can enhance the contrast of your system. Depth of Field. The depth of field of a lens is its ability to keep objects of varying heights in focus. If you need to inspect objects with various heights, choose a lens that can maintain the image quality you need as the objects move closer to and further from the lens. Perspective. Perspective errors often occur when the camera axis is not perpendicular to the object you are inspecting. Figure 3-3a shows an ideal camera position; Figure 3-3b shows a camera imaging an object from an angle. (Figure 3-3, Camera Angle Relative to the Object under Inspection: 1) lens distortion, 2) perspective error, 3) known orientation offset.) Perspective errors appear as changes in the object's magnification depending on the object's distance from the lens. Figure 3-4a shows a grid of dots; Figure 3-4b illustrates perspective err
113. …anges and lighting changes. The color pattern matching tool maintains its ability to locate the reference patterns and gives accurate results despite these changes. Pattern Orientation and Multiple Instances. A color pattern matching tool locates the reference pattern in an image even when the pattern in the image is rotated and slightly scaled. When a pattern is rotated or scaled in the image, the color pattern matching tool detects the following features of an image: the pattern in the image, the position of the pattern in the image, the orientation of the pattern, and multiple instances of the pattern in the image (if applicable). Figure 14-14a shows a template image, or pattern. Figures 14-14b and 14-14c illustrate multiple occurrences of the template: Figure 14-14b shows the template shifted in the image, and Figure 14-14c shows the template rotated in the image. (Figure 14-14, Pattern Orientation.) Ambient Lighting Conditions. The color pattern matching tool finds the reference pattern in an image under conditions of uniform changes in the lighting across the image. Because color analysis is more robust when dealing with variations in lighting than grayscale processing, color pattern matching performs better under conditions of non-uniform lighting.
114. …anges of pixel values in grayscale and color images that separate the objects under consideration from the background. When to Use. Use thresholding to extract areas that correspond to significant structures in an image and to focus the analysis on these areas. In the example image histogram shown, the threshold interval is [166, 255]: pixels outside the threshold interval are set to 0 and are considered part of the background area, while pixels inside the threshold interval are set to 1 and are considered part of a particle area. Thresholding Concepts. Intensity Threshold. Particles are characterized by an intensity range. They are composed of pixels with gray-level values belonging to a given threshold interval (overall luminosity or gray shade). All other pixels are considered to be part of the background. The threshold interval is defined by two parameters, Lower Threshold and Upper Threshold. All pixels that have gray-level values greater than or equal to the Lower Threshold and less than or equal to the Upper Threshold are selected as pixels belonging to particles in the image. Thresholding Example. This example uses the following source image. Highlighting the pixels that belong to the threshold interval [166, 255], the brightest areas, produces the following image. A critical and frequent
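The threshold interval [Lower Threshold, Upper Threshold] described above amounts to a simple per-pixel range test:

```python
import numpy as np

def threshold_interval(img, lower, upper):
    """Keep pixels whose gray level lies in [lower, upper] (set to 1);
    all other pixels become background (0), as defined in the text."""
    return ((img >= lower) & (img <= upper)).astype(np.uint8)
```

With the interval [166, 255] from the example, only the brightest pixels survive as particle pixels.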
115. …anual. Display. This chapter contains information about image display, palettes, regions of interest, and nondestructive overlays. Image Display. Displaying images is an important component of a vision application because it gives you the ability to visualize your data. Image processing and image visualization are distinct and separate elements. Image processing refers to the creation, acquisition, and analysis of images. Image visualization refers to how image data is presented and how you can interact with the visualized images. A typical imaging application uses many images in memory that the application never displays. Image Display Concepts. Depending on your application development environment, you can display images in the following image display environments: an external window (LabVIEW and LabWindows/CVI), the LabVIEW Image Display control (LabVIEW 7.0 or higher), and the CWIMAQViewer ActiveX control (Visual Basic). Display functions display images, set attributes of the image display environment, assign color palettes to image display environments, close image display environments, and set up and use an image browser in image display environments. Some region of interest (ROI) functions, a subset of the display functions, interactively define ROIs in image display environments. These ROI functions configure and display different drawing tools, detect draw events, and retrieve information about the region drawn on the image display
116. …ase the contrast in images that do not use all gray levels. The equalization can be limited to a gray-level interval, also called the equalization range. In this case, the function evenly distributes the pixels belonging to the equalization range over the full interval (0 to 255 for an 8-bit image), and the other pixels are set to 0. The image produced reveals details in the regions that have an intensity in the equalization range; other areas are cleared. Equalization Example 1. This example shows how an equalization of the interval [0, 255] can spread the information contained in the three original peaks over larger intervals. The transformed image reveals more details about each component in the original image. The following graphics show the original image and histograms. Note: In Examples 1 and 2, graphics on the left represent the original image, graphics on the top right represent the linear histogram, and graphics on the bottom right represent the cumulative histogram. An equalization from [0, 255] to [0, 255] produces the following image and histograms. Note: The cumulative histogram of an image after a histogram equalization always has a linear profile, as seen in the preceding example. Equalization Example 2. This example shows how an
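Range-limited equalization can be sketched in a few lines of NumPy. This is an illustration of the behavior described above, not the exact IMAQ Vision implementation:

```python
import numpy as np

def equalize_range(img, lo=0, hi=255):
    """Histogram equalization limited to the equalization range [lo, hi]:
    pixels inside the range are spread over 0..255 via the cumulative
    histogram; pixels outside the range are set to 0."""
    out = np.zeros_like(img)
    inside = (img >= lo) & (img <= hi)
    hist = np.bincount(img[inside], minlength=256).astype(float)
    cdf = np.cumsum(hist)
    if cdf[-1] == 0:
        return out                       # nothing inside the range
    lut = np.round(cdf / cdf[-1] * 255).astype(img.dtype)
    out[inside] = lut[img[inside]]
    return out
```

Because the mapping is the normalized cumulative histogram, the equalized image's cumulative histogram comes out approximately linear, as the note above observes.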
117. …ate System .... 3-9
Calibration Algorithms .... 3-11
Calibration Quality Information .... 3-12
Image Correction .... 3-13
    Scaling Mode .... 3-14
    Correction Region .... 3-15
Simple Calibration .... 3-16
Redefining a Coordinate System .... 3-17

Part II  Image Processing and Analysis

Chapter 4  Image Analysis
    Histogram .... 4-1
        When to Use .... 4-1
        Histogram Concepts .... 4-2
        Linear Histogram .... 4-3
        Cumulative Histogram .... 4-3
        Interpretation .... 4-4
        Histogram Scale .... 4-4
        Histogram of Color Images .... 4-5
    Line Profile .... 4-5
        When to Use .... 4-5
    Intensity Measurements .... 4-6
        When to Use .... 4-6
        Concepts .... 4-6
        Densitometry .... 4-6

Chapter 5  Image Processing
    Lookup Tables .... 5-1
        When to Use .... 5-1
        LUT Transformation Concepts .... 5-1
        Example .... 5-2
    Predefined Lookup Tables ....
118. …ate system. During the inspection process, the stage may translate and rotate by a known amount. This causes the board to occupy a new location in the camera's field of view, which makes the board appear translated and rotated in subsequent images, as shown in Figure 3-12b. Because the board has moved, the original coordinate system no longer aligns with the corner of the board; therefore, measurements made using this coordinate system will be inaccurate. Use the information about how much the stage has moved to determine the new location of the corner of the board in the image, and update the coordinate system using simple calibration to reflect this change. The origin of the updated coordinate reference system becomes the new pixel location of the corner of the board, and the angle of the coordinate system is the angle by which the stage has rotated. The updated coordinate system is shown in Figure 3-12c. Measurements made with the new coordinate system are accurate. (Figure 3-12, Moving Coordinate System.) Part II: Image Processing and Analysis. This section describes conceptual information about image analysis and processing operators and frequency domain analysis. Part II, Image Processing and Analysis, contains the following chapters: Chapter 4, Image Analysis, contains information about h
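Updating the coordinate system after a known stage move can be sketched as follows. This is an illustrative helper, not an NI function, and it assumes the stage translation has already been converted to pixel units:

```python
def update_coordinate_system(origin, angle_deg, stage_shift_px, stage_rotation_deg):
    """Sketch of redefining a simple calibration after a known stage
    move: the corner's new pixel location becomes the new origin, and
    the stage rotation is added to the coordinate system's angle."""
    new_origin = (origin[0] + stage_shift_px[0],
                  origin[1] + stage_shift_px[1])
    return new_origin, angle_deg + stage_rotation_deg
```

Measurements made against the returned origin and angle then track the board in its new position, as Figure 3-12c illustrates.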
119. …ath in the image. This function performs an edge extraction and then finds edge pairs based on specified criteria, such as the distance between the leading and trailing edges, edge contrasts, and so forth.
center of mass: The point on an object where all the mass of the object could be concentrated without changing the first moment of the object about any axis.
character recognition: The ability of a machine to read human-readable text.
chroma: The color information in a video signal; the combination of hue and saturation.
chromaticity: The relationship between chromaticity and brightness characterizes a color. See chroma.
circle detection: Detects circular objects in a binary image.
closing: A dilation followed by an erosion. A closing fills small holes in objects and smooths the boundaries of objects.
clustering: Technique where the image is sorted within a discrete number of classes corresponding to the number of phases perceived in an image. The gray values and a barycenter are determined for each class, and this process is repeated until a value is obtained that represents the center of mass for each phase or class.
color lookup table (CLUT): Table for converting the value of a pixel in an image into a red, green, and blue (RGB) intensity.
color image: Images containing color information, usually encoded in the RGB form.
color model: The mathematical representation for a color. For example, color can be described in terms of red, green, and blue; hue, saturation, and luminance; or hue, saturation, and intensity.
complex image: Stores information obtained from the FFT of an ima
120. automatic clustering 8 3 8 7 entropy 8 5 8 7 in depth discussion 8 6 interclass variance 8 5 8 8 metric 8 5 8 8 moments 8 5 8 9 techniques 8 6 color 8 9 example 8 2 intensity threshold 8 2 overview 8 1 when to use 8 1 TIFF tagged image file format 1 5 training customer B 1 transforming color spaces See color spaces O National Instruments Corporation Index trichromatic theory of color 1 14 troubleshooting resources B 1 truncation highpass FFT filters 7 9 lowpass FFT filters 7 6 truth tables 6 4 two dimensional edge detection See edge detection type factor digital particles 10 17 W Web professional services B 1 technical support B 1 working distance definition 3 3 worldwide technical support B 1 X XOR operator table 6 3 Y YIQ color space description 1 19 transforming RGB to YIQ 1 25 IMAQ Vision Concepts Manual
ch as the ones in the illustration.

Prewitt #10:
 0 -1 -1
 1  0 -1
 1  1  0

Prewitt #2:
 0  1  1
-1  0  1
-1 -1  0

Prewitt #2 highlights pixels where the light intensity increases along the direction going from southwest to northeast. It darkens pixels where the light intensity decreases along that same direction. This processing outlines the southwest front edges of bright regions, such as the ones in the illustration.

Edge Extraction and Edge Highlighting

The gradient filter has two effects, depending on whether the central coefficient x is equal to 1 or 0.

Note: Applying Prewitt #10 to an image returns the same results as applying Prewitt #2 to its photometric negative, because reversing the lookup table of an image converts bright regions into dark regions and vice versa.

• If the central coefficient is null (x = 0), the gradient filter highlights the pixels where variations of light intensity occur along a direction specified by the configuration of the coefficients a, b, c, and d. The transformed image contains black-white borders at the original edges, and the shades of the overall patterns are darkened.

(Source Image, Prewitt #14, Filtered Image)

Prewitt #14:
-1 -1  0
-1  0  1
 0  1  1

• If the central coefficient is equal to 1 (x = 1), the gradient filter detects the same variations as mentioned
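The directional behavior of a gradient kernel can be checked by applying it to a simple intensity ramp. The sketch below is plain NumPy, not the IMAQ Vision filter; the Prewitt #2 coefficients are the reconstruction shown above (the minus signs were lost in this copy of the manual, so treat the exact values as an assumption), and the kernel is applied as a sliding sum of products.

```python
import numpy as np

# Prewitt #2 as reconstructed above: zero anti-diagonal, positive
# coefficients toward the northeast, negative toward the southwest.
PREWITT_2 = np.array([[ 0,  1,  1],
                      [-1,  0,  1],
                      [-1, -1,  0]], float)

def apply_kernel(image, kernel):
    """Apply a 3x3 kernel as a sliding sum of products.
    No border handling: the output shrinks by two rows and two columns."""
    h, w = image.shape
    out = np.empty((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
    return out
```

On a ramp that brightens from southwest to northeast, every output value is positive, matching the description of Prewitt #2 above; negating the kernel (Prewitt #10) flips the sign, matching the note about the photometric negative.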
The highpass truncation filter is defined as follows, where f is the frequency distance from the center of the FFT plane and f0 is the cutoff frequency:

H(u, v) = 0 if f <= f0
H(u, v) = 1 if f > f0

The following series of graphics illustrates the behavior of both types of highpass filters. They give the 3D-view profile of the magnitude of the FFT. This example uses the following original FFT image. After highpass attenuation, the central peak has been removed, and variations present at the edges remain. After highpass truncation with f0 = 20% fmax, spatial frequencies inside the truncation range [0, f0] are set to 0. The remaining frequencies are identical to the ones in the original FFT plane.

Mask FFT Filters

A mask FFT filter removes frequencies contained in a mask specified by the user. Depending on the mask definition, this filter can behave as a lowpass, bandpass, highpass, or any other type of selective filter H(u, v).

In-Depth Discussion

Fourier Transform

The spatial frequencies of an image are calculated by a function called the Fourier Transform. It is defined in the continuous domain as

F(u, v) = ∫∫ f(x, y) e^(−j2π(ux + vy)) dx dy

where the integrals run from −∞ to ∞, f(x, y) is the light intensity of the point (x, y), and (u, v) are the horizontal and vertical spatial frequencies. The Fourier Transform assigns a complex number to each pair (u, v). Inversely, a Fast Fourier Transform F(u, v) c
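The truncation definition above can be sketched directly on a centered FFT plane: zero every frequency whose distance from the center is at most f0 and leave all others untouched. This is a minimal NumPy illustration (not the IMAQ Vision implementation); it assumes the FFT has already been shifted so the DC component sits at the center, for example with np.fft.fftshift.

```python
import numpy as np

def highpass_truncate(fft_image, f0_fraction=0.2):
    """Highpass truncation: set H(u,v) = 0 for f <= f0 and 1 for f > f0,
    where f is the distance from the center of the (centered) FFT plane
    and f0 = f0_fraction * fmax."""
    h, w = fft_image.shape
    cy, cx = h // 2, w // 2
    y, x = np.ogrid[:h, :w]
    f = np.hypot(y - cy, x - cx)
    f0 = f0_fraction * f.max()
    out = fft_image.copy()
    out[f <= f0] = 0          # truncation range [0, f0] is zeroed
    return out                # remaining frequencies are unchanged
```

A typical pipeline would be ifft2(ifftshift(highpass_truncate(fftshift(fft2(img))))), keeping the real part as the filtered image.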
ckground area. The labeling function identifies particles using either connectivity-4 or connectivity-8 criteria. For more information on connectivity, refer to the Connectivity section.

Lowpass and Highpass Filters

The lowpass filter removes small particles according to their widths, as specified by a parameter called filter size. For a given filter size N, the lowpass filter eliminates particles whose widths are less than or equal to N − 1 pixels. These particles disappear after (N − 1)/2 erosions.

The highpass filter removes large particles according to their widths, as specified by a parameter called filter size. For a given filter size N, the highpass filter eliminates particles with widths greater than or equal to N pixels. These particles do not disappear after N/2 − 1 erosions.

Both the highpass and lowpass morphological filters use erosions to select particles for removal. Because erosions cannot discriminate particles with widths of 2k pixels from particles with widths of 2k − 1 pixels, a single erosion eliminates both particles that are 2 pixels wide and particles that are 1 pixel wide.

Table 9-5 shows the effect of lowpass and highpass filtering.

Table 9-5. Effect of Lowpass and Highpass Filtering
Filter Size N (N is an even number, N = 2k):
• Lowpass filter: removes particles with a width less than or equal to 2k − 2.
• Highpass filter: removes particles with a width greater than or equal to 2k; uses k − 1 e
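The erosion rule above can be demonstrated with a small sketch: with a 3x3 structuring element (filter size N = 3), one erosion removes any particle up to N − 1 = 2 pixels wide, while wider particles only shrink. This is plain NumPy for illustration, not the IMAQ Vision filter.

```python
import numpy as np

def erode(binary, iterations=1):
    """Binary erosion with a 3x3 square structuring element:
    a pixel survives only if its entire 3x3 neighborhood is set."""
    img = np.asarray(binary, bool)
    h, w = img.shape
    for _ in range(iterations):
        p = np.pad(img, 1, constant_values=False)
        out = np.ones((h, w), bool)
        for dy in range(3):
            for dx in range(3):
                out &= p[dy:dy + h, dx:dx + w]
        img = out
    return img
```

A 2-pixel-wide particle vanishes after a single erosion, consistent with (N − 1)/2 = 1 for N = 3, while a 5-pixel-wide particle survives one erosion and only disappears after three.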
cs of the intensity changes to determine which changes constitute an edge.

(Figure 11-4. Examples of Edges Located on a Bracket: 1. Search Lines; 2. Edges)

Characteristics of an Edge

Figure 11-5 shows a common model that is used to characterize an edge.

(Figure 11-5. Edge Model, plotting gray-level intensities along the search direction: 1. Grayscale Profile; 2. Edge Length; 3. Edge Strength; 4. Edge Location)

The following list includes the main parameters of this model:

• Edge strength: defines the minimum difference in the grayscale values between the background and the edge. The edge strength is also called the edge contrast. Figure 11-6 shows an image that has different edge strengths. The strength of an edge can vary for the following reasons:
  - Lighting conditions: if the overall light in the scene is low, the edges in the image will have low strengths. Figure 11-6 illustrates a change in the edge strength along the boundary of an object relative to different lighting conditions.
  - Objects with different grayscale characteristics: the presence of a very bright object causes other objects in the image with a lower overall intensity to have edges with smaller strengths.

(Figure 11-6, panels a, b, and c)
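The edge-strength parameter can be illustrated with a one-dimensional sketch: scan a grayscale profile along a search line and report only the transitions whose contrast meets the minimum strength. This is a deliberately crude NumPy illustration of the concept (the full model above also uses edge length and location, which are omitted here); it is not the IMAQ Vision detector.

```python
import numpy as np

def find_edges(profile, min_strength):
    """Return pixel indices along a 1-D grayscale profile where the
    intensity jumps by at least `min_strength` between neighbors.
    `min_strength` plays the role of the edge strength (contrast)."""
    diffs = np.diff(np.asarray(profile, float))
    return [i + 1 for i, d in enumerate(diffs) if abs(d) >= min_strength]
```

Raising the strength threshold discards weaker edges, which mirrors the effect of low lighting described above: the same physical boundary may fall below the threshold when the scene is dim.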
ct from the background. Color pattern matching algorithms provide a quick way to locate objects when color is present. Use color pattern matching when the object under inspection has the following qualities:

• The object contains color information that is very different from the background, and you want to find the location of the object in the image very precisely. For these applications, color pattern matching provides a more accurate solution than color location. Because color location does not use shape information during the search phase, finding the locations of the matches with pixel accuracy is very difficult.

• The object has grayscale properties that are very difficult to characterize or that are very similar to other objects in the search image. In such cases, grayscale pattern matching may not give accurate results. If the object has some color information that differentiates it from the other objects in the scene, color provides the machine vision software with the additional information it needs to locate the object.

Figure 14-12 illustrates the advantage of using color pattern matching over color location to locate the resistors in an image. Although color location finds the resistors in the image, the matches are not very accurate, because they are limited to color information. Color pattern matching uses color matching first to locate the objects and then pattern matching to refine the l
d 8-bit for the hue, 8-bit for the saturation, 8-bit for the luminance. Complex: 8 bytes (64-bit), 32-bit for the real part, 32-bit for the imaginary part.

Grayscale Images

A grayscale image is composed of a single plane of pixels. Each pixel is encoded using one of the following single numbers:
• An 8-bit unsigned integer representing grayscale values between 0 and 255
• A 16-bit signed integer representing grayscale values between −32,768 and 32,767
• A single-precision floating-point number, encoded using four bytes, representing grayscale values ranging from −∞ to ∞

Color Images

A color image is encoded in memory as either an RGB or HSL image. Color image pixels are a composite of four values. RGB images store color information using 8 bits each for the red, green, and blue planes. HSL images store color information using 8 bits each for hue, saturation, and luminance. In all of the color models, an additional 8-bit value goes unused. This representation is known as 4 x 8-bit, or 32-bit, encoding:
• Alpha plane (not used)
• Red or hue plane
• Green or saturation plane
• Blue or luminance plane

Complex Images

A complex image contains the frequency information of a grayscale image. Create a complex image by applying a Fast Fourier Transform (FFT) to a grayscale image. When you transform a grayscale image into a complex
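The 4 x 8-bit encoding can be made concrete with a small bit-packing sketch. The exact byte order within the 32-bit word (alpha in the top byte, then red, green, blue) is an assumption here for illustration; the manual only states that one 8-bit plane goes unused.

```python
def pack_rgb(red, green, blue):
    """Pack three 8-bit planes into one 32-bit pixel value.
    Assumed layout (illustrative): unused alpha byte, then R, G, B."""
    return (red << 16) | (green << 8) | blue

def unpack_rgb(value):
    """Recover the 8-bit red, green, and blue planes from a packed pixel."""
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF
```

The same packing applies to HSL images, with hue, saturation, and luminance taking the places of red, green, and blue.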
d by the smallest feature you need to inspect. Try to have at least two pixels represent the smallest feature. You can use the following equation to determine the minimum pixel resolution required by your imaging system:

minimum pixel resolution = (length of object's longest axis / size of object's smallest feature) x 2

If the object does not occupy the entire field of view, the image size will be greater than the pixel resolution. Resolution indicates the amount of object detail that the imaging system can reproduce. Images with low resolution lack detail and often appear blurry. Three factors contribute to the resolution of your imaging system: the field of view, the camera sensor size, and the number of pixels in the sensor. When you know these three factors, you can determine the focal length of your camera lens.

Field of View

The field of view is the area of the object under inspection that the camera can acquire. Figure 3-2 describes the relationship between pixel resolution and the field of view. (Figure 3-2. Relationship between Pixel Resolution and Field of View; the figure labels the field-of-view width and height Wfov and Hfov and the smallest feature sizes w and h.) Figure 3-2a shows an object that occupies the field of view. Figure 3-2b shows an object that occupies less space than the field of view. If w is th
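The rule of thumb above translates into a one-line calculation; the sketch below simply encodes the manual's equation, and the function name is illustrative.

```python
def min_pixel_resolution(longest_axis, smallest_feature):
    """Minimum pixel resolution along an axis so that the smallest
    feature is covered by at least two pixels:
    (longest axis / smallest feature) x 2. Both lengths must use the
    same units."""
    return (longest_axis / smallest_feature) * 2
```

For example, inspecting a 100 mm part whose smallest feature is 0.5 mm requires at least 400 pixels along that axis.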
128. defining 2 9 types of contours table 2 9 when to use 2 8 distance function binary morphology 9 28 distance measurements 13 12 distortion description 3 7 perspective and distortion errors figure 3 6 Divide operator table 6 2 documentation conventions used in the manual xv online library B 1 related documentation xvi drivers instrument B 1 software B 1 E edge detection 11 1 characteristics of edge common model figure 11 5 edge length parameter 11 6 edge polarity parameter 11 6 edge position parameter 11 6 edge strength parameter 11 5 definition of edge 11 4 in pattern matching 12 7 IMAQ Vision Concepts Manual methods for edge detection advanced 11 7 simple 11 7 subpixel accuracy 11 9 overview 11 1 two dimensional search regions Concentric Rake function 11 12 Rake 11 10 Spoke 11 11 when to use alignment 11 3 detection 11 2 dimensional measurements 13 9 gauging 11 2 edge outlining with gradient filters edge extraction and highlighting 5 18 edge thickness 5 19 edge based coordinate system functions single search area 13 5 two search areas 13 6 ellipse measurements digital particles ellipse ratio measurement 10 11 equivalent ellipse axes 10 11 equivalent ellipse axes measurement 10 11 elongation factor measurement 10 16 entropy technique in automatic thresholding in depth discussion 8 7 overview 8 5 Equalize function basic concepts 5 8 examples 5 9 Equiv
deos, and interactive CDs. You also can register for instructor-led, hands-on courses at locations around the world.

• System Integration: If you have time constraints, limited in-house technical resources, or other project challenges, NI Alliance Program members can help. To learn more, call your local NI office or visit ni.com/alliance.

If you searched ni.com and could not find the answers you need, contact your local office or NI corporate headquarters. Phone numbers for our worldwide offices are listed at the front of this manual. You also can visit the Worldwide Offices section of ni.com/niglobal to access the branch office Web sites, which provide up-to-date contact information, support phone numbers, email addresses, and current events.

Glossary

Numbers/Symbols

1D: One-dimensional.

2D: Two-dimensional.

3D: Three-dimensional.

3D view: Displays the light intensity of an image in a three-dimensional coordinate system, where the spatial coordinates of the image form two dimensions and the light intensity forms the third dimension.

AIPD: National Instruments internal image file format used for saving complex images and calibration information associated with an image (extension APD).

alignment: The process by which a machine vision application determi

alpha channel, area, area threshold, arithmetic operators, array, auto median function
Definition of an Edge ... 11-4
Characteristics of an Edge ... 11-5
Edge Detection Methods ... 11-6
Extending Edge Detection to Two Dimensional Search Regions ... 11-10

Chapter 12: Pattern Matching
Introduction ... 12-1
When to Use ... 12-1
Pattern Matching Concepts ... 12-3
What to Expect from a Pattern Matching Tool ... 12-3
Pattern Matching Techniques ... 12-4
Traditional Pattern Matching ... 12-4
New Pattern Matching Techniques ... 12-6
In-Depth Discussion ... 12-7
Cross Correlation ... 12-7
Shape Matching ... 12-9
When to Use ... 12-9
Shape Matching Concepts ... 12-10

Chapter 13: Dimensional Measurements
Introduction ... 13-1
When to Use ... 13-1
Dimensional Measurements Concepts ... 13-2
Locating the Part in the Image ... 13-2
Locating Featu
dix A, Kernels

Laplacian Kernels

The following tables list the predefined Laplacian kernels.

Table A-5. Laplacian 3 x 3

#0 (contour 4):      #1 (image x 1):      #2 (image x 2):
 0 -1  0              0 -1  0              0 -1  0
-1  4 -1             -1  5 -1             -1  6 -1
 0 -1  0              0 -1  0              0 -1  0

#3 (contour 8):      #4 (image x 1):      #5 (image x 2):
-1 -1 -1             -1 -1 -1             -1 -1 -1
-1  8 -1             -1  9 -1             -1 10 -1
-1 -1 -1             -1 -1 -1             -1 -1 -1

#6 (contour 12):     #7 (image x 1):
-1 -2 -1             -1 -2 -1
-2 12 -2             -2 13 -2
-1 -2 -1             -1 -2 -1

Table A-6. Laplacian 5 x 5
#0 (contour 24): a 5 x 5 kernel whose coefficients are all -1, except a central coefficient of 24.
#1 (image x 1): the same kernel with a central coefficient of 25.

Table A-7. Laplacian 7 x 7
#0 (contour 48): a 7 x 7 kernel whose coefficients are all -1, except a central coefficient of 48.
#1 (image x 1): the same kernel with a central coefficient of 49.

Smoothing Kernels

The following tables list the predefined smoothing kernels.

Table A-8. Smoothing 3 x 3
Table A-9. Smoothing 5 x 5
Table A-10. Smoothing 7 x 7

Gaussian Kernels

The following tables list the predefined Gaussian kernels.

Table A-11. Gaussian 3 x 3
e. You also can carry out correlation in the frequency domain using the FFT. If the image and the template are the same size, this approach is more efficient than performing correlation in the spatial domain. In the frequency domain, correlation is obtained by multiplying the FFT of the image by the complex conjugate of the FFT of the template. Normalized cross correlation is considerably more difficult to implement in the frequency domain.

New Pattern Matching Techniques

Classical pattern matching methods have their limitations. New methods, such as the ones used in IMAQ Vision, try to incorporate image understanding techniques to interpret the template information and then use this information to find the template in the image. Image understanding refers to image processing techniques that generate information about the features of a template image. These techniques include the following:

• Geometric modeling of images
• Efficient non-uniform sampling of images
• Extraction of template information that is rotation independent and scale independent

These techniques reduce the amount of information needed to fully characterize an image or pattern, which speeds up the searching process. Also, extracting useful information from a template and removing the redundant and noisy information provides a more accurate
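The frequency-domain correlation described above (multiply the FFT of the image by the complex conjugate of the FFT of the template) can be sketched in a few lines of NumPy. This is plain (unnormalized) cross-correlation for illustration, not the IMAQ Vision implementation; the template is zero-padded to the image size, and the result wraps around circularly, which is inherent to the FFT approach.

```python
import numpy as np

def correlate_fft(image, template):
    """Cross-correlation via the frequency domain:
    IFFT( FFT(image) * conj(FFT(template)) ).
    The peak of the result marks the best template position."""
    F = np.fft.fft2(image)
    T = np.fft.fft2(template, s=image.shape)  # zero-pad to image size
    return np.real(np.fft.ifft2(F * np.conj(T)))
```

Placing a small template inside a larger image and taking the argmax of the correlation surface recovers the template's top-left position.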
133. e format AIPD 1 6 neighbors pixels definition 1 8 IMAQ Vision Concepts Manual 1 12 noise See blur and noise conditions nondestructive overlay basic concepts 2 10 when to use 2 10 nonlinear algorithm for calibration 3 11 nonlinear filters classes table 5 15 differentiation filter mathematical concepts 5 34 overview 5 29 gradient filter mathematical concepts 5 33 overview 5 28 in depth discussion 5 33 lowpass filter mathematical concepts 5 34 overview 5 29 median filter mathematical concepts 5 34 overview 5 30 Nth order filter effects table 5 30 mathematical concepts 5 34 overview 5 30 Prewitt filter description 5 27 example 5 28 mathematical concepts 5 33 predefined kernels A 1 Roberts filter mathematical concepts 5 33 overview 5 29 Sigma filter mathematical concepts 5 34 overview 5 29 Sobel filter description 5 27 example 5 28 mathematical concepts 5 33 predefined kernels A 2 ni com nonlinear gradient filter definition 5 28 NOR operator table 6 3 normalized moments of inertia digital particles 10 5 Nth order filter basic concepts 5 30 examples table 5 30 mathematical concepts 5 34 number of holes 10 14 number of horizontal segments digital particles 10 14 number of planes 1 3 number of vertical segments digital particles 10 14 0 online technical support B 1 opening function binary morphology basic concepts 9 13 examples 9 13 grayscale morp
e. A gray-level or pseudo-color image is composed of one plane, while an RGB image is composed of three planes: one for the red component, one for the blue, and one for the green.

The coordinate position in an image where you want to place the origin of another image. Setting an offset is useful when performing mask operations.

An erosion followed by a dilation. An opening removes small objects and smooths the boundaries of objects in the image.

Allow masking, combination, and comparison of images. You can use arithmetic and logic operators in IMAQ Vision.

A machine vision application that inspects the quality of printed characters.

Contains the low-frequency information at the center and the high-frequency information at the corners of an FFT-transformed image.

Finds the outer boundary of objects.

The gradation of colors used to display an image on screen, usually defined by a color lookup table.

A connected region or grouping of pixels in an image in which all pixels have the same intensity level.

particle analysis, pattern matching, picture aspect ratio, picture element, pixel, pixel aspect ratio, pixel calibration, pixel depth, PNG, power 1/Y function, power Y function, Prewitt filter, probability function, proper closing, proper opening, pts

A series of processing operations and analysis functions that produce some information about the pa
e. Figure 12-1a shows part of a circuit board. Figure 12-1b shows a common fiducial used in printed circuit board (PCB) inspections or chip pick-and-place applications. (Figure 12-1. Example of a Common Fiducial)

Gauging applications locate and then measure, or gauge, the distance between these objects. If the measurement falls within a tolerance range, the part is considered good. If it falls outside the tolerance range, the component is rejected. Searching for and finding a feature is the key processing task that determines the success of many gauging applications, such as inspecting the leads on a quad pack or inspecting an antilock brake sensor. In real-time applications, search speed is critical.

Pattern Matching Concepts

What to Expect from a Pattern Matching Tool

Because pattern matching is the first step in many machine vision applications, it should work reliably under various conditions. In automated machine vision applications, the visual appearance of materials or components under inspection can change because of factors such as orientation of the part, scale changes, and lighting changes. The pattern matching tool maintains its ability to locate the reference patterns despite these changes. The following sections describe commonly occurring situations under which the pattern ma
... xv
Related Documentation ... xvi

Part I

Chapter 1: Digital Images
Definition of a Digital Image ... 1-1
Properties of a Digitized Image ... 1-2
Image Resolution ... 1-2
Image Definition ... 1-2
Number of Planes ... 1-3
Image Types ... 1-3
Grayscale Images ... 1-4
Color Images ... 1-5
Complex Images ... 1-5
Image Files ... 1-5
Internal Representation of an IMAQ Vision Image ... 1-6
Image Borders ... 1-8
Image Masks ... 1-10
When to Use ... 1-10
Concepts ... 1-10
Color Spaces ... 1-13
When to Use ... 1-13
Concepts ... 1-14
The RGB Color Space ... 1-15
HSL Color Space ... 1-17
CIE XYZ Color Space ... 1-17
CIE L*a*b* Color Space ... 1-19
CMY Color Space ... 1-19
YIQ Color Space ... 1-19
In-Depth Discussion ... 1-20
RGB to Grayscale ... 1-20
RGB and HSL ... 1-20
RGB and CIE XYZ ... 1-21
RGB and CIE L*a*b* ... 1-24
e size of the smallest feature in the x direction and h is the size of the smallest feature in the y direction, the minimum x pixel resolution is (Wfov / w) x 2 and the minimum y pixel resolution is (Hfov / h) x 2. Choose the larger pixel resolution of the two for your imaging application.

Note: In Figure 3-2b, the image size is larger than the pixel resolution.

Sensor Size and Number of Pixels in the Sensor

The camera sensor size is important in determining your field of view, which is a key element in determining your minimum resolution requirement. The sensor's diagonal length specifies the size of the sensor's active area. The number of pixels in your sensor should be greater than or equal to the pixel resolution. Choose a camera with a sensor that satisfies your minimum resolution requirement.

Lens Focal Length

When you determine the field of view and appropriate sensor size, you can decide which type of camera lens meets your imaging needs. A lens is defined primarily by its focal length. The relationship between the lens, field of view, and sensor size is as follows:

focal length = (sensor size x working distance) / field of view

If you cannot change the working distance, you are limited in choosing a focal length for your lens. If you have a fixed working distance and your focal length is short, your images may appear distorted. However, if you have the flexibility to ch
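The focal length relationship above is a straightforward calculation; the sketch below encodes the manual's equation, with illustrative numbers (an assumed 8.8 mm sensor, 500 mm working distance, and 100 mm field of view).

```python
def focal_length(sensor_size, working_distance, field_of_view):
    """focal length = (sensor size x working distance) / field of view.
    All three lengths must use the same units; the result is in those
    units as well (typically millimeters)."""
    return sensor_size * working_distance / field_of_view
```

With the example numbers, a lens of roughly 44 mm focal length would be required; in practice you would round to the nearest available stock lens and adjust the working distance.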
... 14-10
Identification ... 14-11
What to Expect from a Color Location Tool ... 14-12
Pattern Orientation and Multiple Instances ... 14-13
Ambient Lighting Conditions ... 14-13
Blur and Noise Conditions ... 14-14
Color Location Concepts ... 14-14
Color Pattern Matching ... 14-16
When to Use ... 14-16
What to Expect from a Color Pattern Matching Tool ... 14-19
Pattern Orientation and Multiple Instances ... 14-19
Ambient Lighting Conditions ... 14-20
Blur and Noise Conditions ... 14-21
Color Pattern Matching Concepts ... 14-21
Color Matching and Color Location ... 14-21
Grayscale Pattern Matching ... 14-22
Combining Color Location and Grayscale Pattern Matching ... 14-22

Chapter 15: Instrument Readers
Introduction ... 15-1
When to Use ... 15-1
Meter Functions ... 15-1
Meter Algorithm Limits ...
e and giving each object a unique identity.

Laboratory Virtual Instrument Engineering Workbench. Program development environment application based on the programming language G, used commonly for test and measurement applications.

Extracts the contours of objects in the image by highlighting the variation of light intensity surrounding a pixel.

Measures the distance between selected edges with high precision (subpixel accuracy) along a line in an image. For example, this function can be used to measure distances between points and edges. This function also can step and repeat its measurements across the image.

Represents the gray-level distribution along a line of pixels in an image.

A special algorithm that calculates the value of a pixel based on its own pixel value as well as the pixel values of its neighbors. The sum of this calculation is divided by the sum of the elements in the matrix to obtain a new pixel value.

Expands low gray-level information in an image while compressing information from the high gray-level ranges. Increases the brightness and contrast in dark regions of an image and decreases the contrast in bright regions of the image.

The image operations AND, NAND, OR, XOR, NOR, XNOR, difference, mask, mean, max, and min.

Compression in which the decompressed image is identical to the original image.
e search lines are drawn parallel to the orientation of the rectangle. Control the number of lines in the area by specifying the search direction as left to right or right to left for a horizontally oriented rectangle. Specify the search direction as top to bottom or bottom to top for a vertically oriented rectangle. Figure 11-11 illustrates the basics of the Rake function. (Figure 11-11. Rake Function: 1. Search Area; 2. Search Line; 3. Search Direction; 4. Edge Points)

Spoke

Spoke works on an annular search region, working along search lines that are drawn from the center of the region to the outer boundary and that fall within the search area. Control the number of lines in the region by specifying the angle between each line. Specify the search direction as either going from the center outward or from the outer boundary to the center. Figure 11-12 illustrates the basics of the Spoke function. (Figure 11-12. Spoke Function: 1. Search Area; 2. Search Line; 3. Search Direction; 4. Edge Points)

Concentric Rake

Concentric Rake works on an annular search region. The Concentric Rake is an adaptation of the Rake to an annular
e speed requirements of many applications.

Pyramidal Matching

You can improve the computation time by reducing the size of the image and the template. One such technique is called pyramidal matching. In this method, both the image and the template are subsampled to smaller spatial resolutions; the image and the template can each be reduced to one-fourth of their original size. Matching is performed first on the reduced images. Because the images are smaller, matching is faster. When the matching is complete, only areas with a high match need to be considered as matching areas in the original image.

Scale-Invariant Matching

Normalized cross correlation is a good technique for finding patterns in an image when the patterns in the image are not scaled or rotated. Typically, cross correlation can detect patterns of the same size up to a rotation of 5° to 10°. Extending correlation to detect patterns that are invariant to scale changes and rotation is difficult. For scale-invariant matching, you must repeat the process of scaling or resizing the template and then perform the correlation operation. This adds a significant amount of computation to your matching process. Normalizing for rotation is even more difficult. If a clue regarding rotation can be extracted from the image, you can simply rotate the template and do the correlation. However, if the nature of the rotation is unknown, looking for the best match requires exhaustive rotations of the templat
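The pyramidal idea above, match at a coarse level first, then refine only the promising area at full resolution, can be sketched compactly. This is a minimal two-level illustration (one 2x2-averaging pyramid level, brute-force sum-of-absolute-differences scoring, a fixed refinement window), not the IMAQ Vision implementation.

```python
import numpy as np

def downsample(img):
    """One pyramid level: halve resolution by averaging 2x2 blocks."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    a = img[:h, :w].astype(float)
    return (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]) / 4

def best_match(image, template):
    """Brute-force best top-left position by sum of absolute differences."""
    H, W = image.shape
    h, w = template.shape
    scores = np.array([[np.abs(image[y:y+h, x:x+w] - template).sum()
                        for x in range(W - w + 1)] for y in range(H - h + 1)])
    return np.unravel_index(np.argmin(scores), scores.shape)

def pyramid_match(image, template):
    """Match at half resolution, then refine the coarse estimate at full
    resolution within a small (+/-2 pixel) neighborhood."""
    cy, cx = best_match(downsample(image), downsample(template))
    y0, x0 = max(0, 2 * cy - 2), max(0, 2 * cx - 2)
    h, w = template.shape
    sub = image[y0:y0 + h + 4, x0:x0 + w + 4]
    ry, rx = best_match(sub, template)
    return y0 + ry, x0 + rx
```

Because the coarse pass scores a quarter-size image, most of the search happens cheaply, and the full-resolution scoring is confined to a few candidate positions.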
ea in the image covered by the grid dots is calibrated accurately. The calibration ROI output of the calibration function defines the region of the image in which the calibration information is accurate. The calibration ROI in the perspective method encompasses the entire image. The calibration ROI in the nonlinear method encompasses the bounding rectangle that encloses all the rectangular regions around the grid dots. Figure 3-8 illustrates the calibration ROI concept. (Figure 3-8. Calibrating ROIs: 1. Calibration ROI Using the Perspective Algorithm; 2. Calibration ROI Using the Nonlinear Algorithm; 3. Rectangular Region Surrounding Each Dot)

Note: You can convert pixels that lie outside the calibration ROI to real-world units, but these conversions may be inaccurate.

Calibration Quality Information

The quality score and error map outputs of the calibration function indicate how well your system is calibrated. The quality score, which ranges from 0 to 1,000, reflects how well the calibration function learned the grid or set of points you provided. The quality score does not reflect the accuracy of the calibration but rather describes how well the calibration mapping adapts to the learned grid or feature points. Use the quality score to determine whether the
ect regions of interest, zoom in and out, and change the image palette.

A table associated with a logic operator that describes the rules used for that operation.

V: Volts.

value: The grayscale intensity of a color pixel, computed as the average of the maximum and minimum red, green, and blue values of that pixel.

VI: Virtual Instrument. (1) A combination of hardware and/or software elements, typically used with a PC, that has the functionality of a classic stand-alone instrument. (2) A LabVIEW software module (VI), which consists of a front panel user interface and a block diagram program.

W

watershed: A technique used to segment an image into multiple regions.

white reference level: The level that defines what is white for a particular video system. See also black reference level.

zones: Areas in a displayed image that respond to user clicks. You can use these zones to control events, which can then be interpreted within LabVIEW.

Index

Numerics
16-bit image display, mapping methods for, 2-2

A
Absolute Difference operator (table), 6-2
Add operator (table), 6-2
advanced binary morphology functions. See binary morphology.
AIPD (National Instruments internal image file format), 1-6
alignment application
  color pattern matching, 14-19
  edge detection, 11-3
  pattern matching, 12-1
ambient lighting conditions
  color location
binary morphology (continued)
  connectivity
    basic concepts and examples, 9-7
    connectivity-4, 9-8
    connectivity-8, 9-9
    in-depth discussion, 9-8
    when to use, 9-7
  overview, 9-1
  pixel frame shape
    examples (figures), 9-4
    hexagonal frame, 9-6
    overview, 9-4
    square frame, 9-6
  primary morphology functions
    auto-median, 9-21
    erosion and dilation, 9-10
    hit-miss, 9-14
    inner gradient, 9-14
    opening and closing, 9-13
    outer gradient, 9-14
    proper-closing, 9-20
    proper-opening, 9-20
    thickening, 9-18
    thinning, 9-17
    when to use, 9-9
  structuring elements
    basic concepts, 9-2
    pixel frame shape, 9-4
    size, 9-2
    values, 9-3
    when to use, 9-1
binary palette
  gray-level values (table), 2-7
  periodic palette (figure), 2-8
binary shape matching, 12-9
bit depth, image, definition, 1-2
bitmap (BMP) file format, 1-5
blur and noise conditions
  color location tool, 14-14
  color pattern matching, 14-21
  pattern matching, 12-4
BMP (bitmap) file format, 1-5
border function, binary morphology, 9-22
borders. See image borders
Bounding Rect, digital particles, 10-3, 10-8, 10-9, 10-16
brightness, definition, 1-17

C
calibration. See spatial calibration
center of mass, 10-2, 10-5, 10-7, 10-15
chromaticity, definition, 1-17
CIE L*a*b* color space
  overview, 1-19
  transforming RGB to CIE L*a*b*, 1-24
CIE XYZ color space
  overview, 1-17
  transforming RGB to CIE XYZ, 1-21
circle detection functions, in dimensional measurements, 13-9
circle function, binary morph
…according to the template defined by the structuring element. If I is the source image, the proper-opening function extracts the intersection between the source image and its transformed image obtained after an opening, followed by a closing, and then followed by another opening:

    proper opening(I) = AND(I, OCO(I))

or

    proper opening(I) = AND(I, DEEDDE(I))

where I is the source image, E is an erosion, D is a dilation, O is an opening, C is a closing, F(I) is the image obtained after applying the function F to the image I, and GF(I) is the image obtained after applying the function F to the image I followed by the function G.

Proper-Closing Function

The proper-closing function is a finite and dual combination of closings and openings. It fills tiny holes and smooths the inner contour of particles according to the template defined by the structuring element. If I is the source image, the proper-closing function extracts the union of the source image and its transformed image obtained after a closing, followed by an opening, and then followed by another closing:

    proper closing(I) = OR(I, COC(I))

or

    proper closing(I) = OR(I, EDDEED(I))

where I is the source image, E is an erosion, D is a dilation, O is an opening, C is a closing, F(I) is the image obtained after applying the function F to the image I, and GF(I) is the image obtained after applying the function F to the image I followed by the function G.
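As a concrete sketch, the two compositions above can be written with SciPy's binary morphology operators as a stand-in for IMAQ Vision's functions (an assumption on my part; the structuring element here is SciPy's default 3 x 3 cross):

```python
# Sketch of proper opening/closing exactly as defined above:
#   proper opening(I) = AND(I, O(C(O(I))))
#   proper closing(I) = OR(I, C(O(C(I))))
import numpy as np
from scipy import ndimage as ndi

def opening(img, s=None):              # O = dilation after erosion
    return ndi.binary_dilation(ndi.binary_erosion(img, s), s)

def closing(img, s=None):              # C = erosion after dilation
    return ndi.binary_erosion(ndi.binary_dilation(img, s), s)

def proper_opening(img, s=None):
    return img & opening(closing(opening(img, s), s), s)

def proper_closing(img, s=None):
    return img | closing(opening(closing(img, s), s), s)

# A 5x5 particle with a tiny one-pixel hole: proper closing fills the hole
# while never removing any original particle pixel.
img = np.zeros((7, 7), dtype=bool)
img[1:6, 1:6] = True
img[3, 3] = False
filled = proper_closing(img)
```

By construction the AND with I makes proper opening a subset of the source, and the OR with I makes proper closing a superset, which is why these operators clean particles without eroding them away.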
…region of interest: An area of the image that is graphically selected from a window displaying the image. This area can be used to focus further processing.

ROI tools: Collection of tools that enable you to select a region of interest from an image. These tools let you select points, lines, annuli, polygons, rectangles, rotated rectangles, ovals, and freehand open and closed contours.

rotation angle: The amount by which one image is rotated with respect to a reference image. This rotation is computed with respect to the center of the image.

rotation-invariant matching: A pattern matching technique in which the reference pattern can be located at any orientation in the test image as well as rotated at any degree.

S

s: Seconds.

saturation: The amount of white added to a pure color. Saturation relates to the richness of a color. A saturation of zero corresponds to a pure color with no white added. Pink is a red with low saturation.

scale-invariant matching: A pattern matching technique in which the reference pattern can be any size in the test image.

segmentation function: Fully partitions a labeled binary image into non-overlapping segments, with each segment containing a unique object.

separation function: Separates objects that

shape matching
shift-invariant matching
Sigma filter
skeleton function
skiz function
smoothing filter
Sobel filter
spatial calibration
spatial filters
spatial resolution
…between 100 and 150, and its blue value should lie between 55 and 115.

Figure 8-1. Threshold Ranges for an RGB Image (red plane histogram, threshold range 130 to 200; green plane histogram, 100 to 150; blue plane histogram, 55 to 115; full scale 0 to 255)

To threshold an RGB image, first determine the red, green, and blue values of the pixels that constitute the objects you want to analyze after thresholding. Then specify a threshold range for each color plane that encompasses the color values of interest. You must choose correct ranges for all three color planes to isolate a color of interest.

Figure 8-2 shows the histograms of each plane of a color image stored in HSL format. The gray shaded region indicates the threshold range for each of the color planes. For a pixel in the color image to be set to 1 in the binary image, its hue value should lie between 165 and 215, its saturation value should lie between 0 and 30, and its luminance value should lie between 25 and 210.

Figure 8-2. Threshold Ranges for an HSL Image (hue, saturation, and luminance plane histograms)

The hue plane contains the main color information in an image. To threshold an HSL image, first determine the hue values of the pixels that you want to analyze after thresholding.
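The per-plane RGB thresholding described above can be sketched in a few lines with NumPy arrays standing in for IMAQ images; the ranges are the ones from Figure 8-1, and the function name is illustrative:

```python
# Binary thresholding of an RGB image: a pixel becomes 1 only if all three
# planes fall inside their respective ranges (Figure 8-1: R 130-200,
# G 100-150, B 55-115). Illustrative sketch, not IMAQ Vision's API.
import numpy as np

def threshold_rgb(img, r_range, g_range, b_range):
    """Return a binary image: 1 where every plane is inside its range."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mask = ((r_range[0] <= r) & (r <= r_range[1]) &
            (g_range[0] <= g) & (g <= g_range[1]) &
            (b_range[0] <= b) & (b <= b_range[1]))
    return mask.astype(np.uint8)

# One in-range pixel and one white pixel (white fails all three ranges).
img = np.array([[[150, 120, 80], [255, 255, 255]]], dtype=np.uint8)
binary = threshold_rgb(img, (130, 200), (100, 150), (55, 115))
```

The same pattern applies to HSL thresholding: replace the three planes with hue, saturation, and luminance and widen the luminance range to cover all values when you want intensity-independent selection.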
…general applications: gauging, inspection, and alignment.

Gauging
Many gauging applications locate and then measure, or gauge, the distance between objects. Searching for and finding a feature is the key processing task that determines the success of many gauging applications. If the components you want to gauge are uniquely identified by their color, color pattern matching provides a fast way to locate the components.

Inspection
Inspection detects simple flaws, such as missing parts or unreadable printing. A common application is inspecting the labels on consumer product bottles for printing defects. Because most of the labels are in color, color pattern matching is used to locate the labels in the image before a detailed inspection of the label is performed. The score returned by the color pattern matching tool can also be used to decide whether a label is acceptable.

Alignment
Alignment determines the position and orientation of a known object by locating fiducials. Use the fiducials as points of reference on the object. Grayscale pattern matching is sufficient for most applications, but some alignment applications require color pattern matching for more reliable results.

What to Expect from a Color Pattern Matching Tool
In automated machine vision applications, the visual appearance of materials or components under inspection can change due to factors such as orientation of the part and scale changes
Because of the different convolution kernels they combine, the nonlinear Prewitt filter has a tendency to outline curved contours, while the nonlinear Sobel filter extracts square contours. This difference is noticeable when observing the outlines of isolated pixels.

Nonlinear Gradient Filter
The nonlinear gradient filter outlines contours where an intensity variation occurs along the vertical axis.

Roberts Filter
The Roberts filter outlines the contours that highlight pixels where an intensity variation occurs along the diagonal axes.

Differentiation Filter
The differentiation filter produces continuous contours by highlighting each pixel where an intensity variation occurs between itself and its three upper-left neighbors.

Sigma Filter
The Sigma filter is a highpass filter. It outlines contours and details by setting pixels to the mean value found in their neighborhood if their deviation from this value is not significant. The example on the left shows an image before filtering; the example on the right shows the image after filtering.

Lowpass Filter
The lowpass filter reduces details and blurs edges by setting pixels to the mean value found in their neighborhood if their deviation from this value is large. The example on the left shows an image before filtering; the example on the right shows the image after filtering.
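The Roberts filter's diagonal differencing is simple enough to sketch directly. The formula below is the common |d1| + |d2| approximation of the Roberts gradient, not necessarily IMAQ Vision's exact kernel:

```python
# Illustrative Roberts gradient: responds to intensity variations along the
# diagonal axes. Output is one row/column smaller than the input.
import numpy as np

def roberts(img):
    img = img.astype(float)
    d1 = img[:-1, :-1] - img[1:, 1:]      # diagonal difference
    d2 = img[:-1, 1:] - img[1:, :-1]      # anti-diagonal difference
    return np.abs(d1) + np.abs(d2)

# Vertical step edge: the filter responds along the edge, not in flat areas.
step = np.zeros((4, 4))
step[:, 2:] = 10
edges = roberts(step)
```

Flat regions on either side of the step produce zero response, while the column straddling the step produces a strong one, which is exactly the contour-outlining behavior described above.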
…structuring elements as large as 7 x 7 without any modification. If you plan to use structuring elements larger than 7 x 7, specify a correspondingly larger border when creating your image.

The size of the structuring element determines the speed of the morphological transformation: the smaller the structuring element, the faster the transformation.

Structuring Element Values
The binary values of a structuring element determine which neighborhood pixels to consider during a transformation, in the following manner:
• If the value of a structuring element sector is 1, the value of the corresponding source image pixel affects the central pixel's value during a transformation.
• If the value of a structuring element sector is 0, the morphological function disregards the value of the corresponding source image pixel.

Figure 9-2 illustrates the effect of structuring element values during a morphological function. A morphological transformation using a structuring element alters a pixel P0 so that it becomes a function of its neighboring pixel values.

Figure 9-2. Effect of Structuring Element Values on a Morphological Function (the 1-valued sectors of the structuring element select which neighbors of P0 are used to calculate the new P0 value)
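The neighbor-selection rule above maps directly onto the `structure` argument of SciPy's binary morphology functions, which I use here as an illustrative stand-in for IMAQ Vision: only positions where the structuring element is 1 must be set for the central pixel to survive an erosion.

```python
# How structuring-element values select neighbors: with a cross-shaped
# element, a pixel survives erosion only if it and its four 1-valued
# neighbors are all set; the 0-valued corner sectors are ignored.
import numpy as np
from scipy import ndimage as ndi

structure = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]])       # 0 sectors: pixel value disregarded

img = np.zeros((5, 5), dtype=bool)
img[1:4, 1:4] = True                    # a 3x3 particle
eroded = ndi.binary_erosion(img, structure)
```

Only the particle's center pixel has all four cross neighbors inside the particle, so it alone survives; a full 3 x 3 structuring element (all ones) would erode the particle away entirely.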
…environment, and move and rotate ROIs. Nondestructive overlays display important information on top of an image without changing the values of the image pixels.

When to Use
Use display functions to visualize your image data, retrieve generated events and the associated data from an image display environment, select ROIs from an image interactively, and annotate the image with additional information.

In-Depth Discussion
This section describes the display modes available in IMAQ Vision and the 16-bit grayscale display mapping methods.

Display Modes
One of the key components of displaying images is the display mode in which the video adaptor operates. The display mode indicates how many bits specify the color of a pixel on the display screen. Generally, the display modes available from a video adaptor range from 8 bits to 32 bits per pixel, depending on the amount of video memory available on the video adaptor and the screen resolution you choose. If you have an 8-bit display mode, a pixel can be one of 256 different colors. If you have a 16-bit display mode, a pixel can be one of 65,536 colors. In 24-bit or 32-bit display mode, the color of a pixel on the screen is encoded using 3 or 4 bytes, respectively. In these modes, information is stored using 8 bits each for the red, green, and blue components of the pixel. These modes
…higher-level geometric interpretation of edges in the form of geometric objects, such as circles and lines.

Figure 12-6. Edge Detection and Pattern Matching Techniques

IMAQ Vision uses a combination of the edge information in the image and an intelligent image sampling technique to match patterns. In cases where the pattern can be rotated in the image, a similar technique is used, but with specially chosen template pixels whose values, or relative change in values, reflect the rotation of the pattern. The result is fast and accurate pattern matching. IMAQ Vision pattern matching is able to accurately locate objects when they vary in size (±5%) and orientation (between 0 and 360 degrees) and when their appearance is degraded.

In-Depth Discussion
This section provides additional information you may need for building successful pattern matching tools.

Cross Correlation
The following is the basic concept of correlation. Consider a subimage w(x, y) of size K x L within an image f(x, y) of size M x N, where K ≤ M and L ≤ N. The correlation between w(x, y) and f(x, y) at a point (i, j) is given by

    C(i, j) = Σ (x = 0 to L-1) Σ (y = 0 to K-1) w(x, y) f(x + i, y + j)

where i = 0, 1, ..., M-1, j = 0, 1, ..., N-1, and the summation is taken over the region in the image where w and f overlap. Figure 12-7 illustrates the correlation procedure. Assume that the origin
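The correlation sum C(i, j) defined above can be implemented directly. The sketch below evaluates it only at positions where the template lies fully inside the image (an assumption on my part; the manual's index range covers all offsets):

```python
# Direct implementation of the cross-correlation sum C(i, j): slide the
# template w over the image f and sum the elementwise products. The peak
# of C marks the best match position.
import numpy as np

def correlate(f, w):
    K, L = w.shape
    M, N = f.shape
    C = np.zeros((M - K + 1, N - L + 1))
    for i in range(M - K + 1):
        for j in range(N - L + 1):
            C[i, j] = np.sum(w * f[i:i + K, j:j + L])
    return C

f = np.zeros((6, 6))
f[2:4, 3:5] = 1                 # a bright 2x2 target at row 2, column 3
w = np.ones((2, 2))             # the template
C = correlate(f, w)
peak = np.unravel_index(np.argmax(C), C.shape)
```

Plain correlation like this is sensitive to overall brightness, which is one reason practical matchers normalize the sum; the edge-and-sampling approach described above avoids evaluating it at every pixel.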
…Light drift is quantified by the difference between the average gray level of the left upper line and the right bottom line of the background of the barcode. Decoding inaccuracies can occur if the light drift is greater than 120 for barcodes with two different widths of bars and spaces, and greater than 100 for barcodes with four different widths of bars and spaces.

In overexposed images, the gray levels of the wide and narrow bars in the barcode tend to differ, and the narrow bars are scarcely visible. Decoding results may not be accurate when the difference in gray levels is less than 80 for barcodes with two different widths of bars and spaces, and less than 100 for barcodes with four different widths of bars and spaces. Consider also the difference in gray levels between the narrow bars and the wide bars: if this difference exceeds 115 on 8-bit images (256 gray levels) for barcodes with two different widths of bars and spaces, and 100 for barcodes with four different widths of bars and spaces, the results may be inaccurate.

Noise is defined as the standard deviation of a rectangular region of interest drawn in the background. It must be less than 57 for barcodes with two different widths of bars and spaces, and less than 27 for barcodes with four different widths of bars and spaces.

Reflections on the barcode can introduce errors in the value read from the barcode. Bars and spaces that are masked by the reflection produce
…level rather than a pixel level.

image display: The presentation (display) of an image (image data) to the user.

imaging: Any process of acquiring and displaying images and analyzing image data.

IMAQ: Image Acquisition.

in.: Inches.

INL: Integral nonlinearity. A measure in LSB of the worst-case deviation from the ideal A/D or D/A transfer characteristic of the analog I/O circuitry.

inner gradient function: Finds the inner boundary of objects.

inspection: The process by which parts are tested for simple defects, such as missing parts or cracks on part surfaces.

inspection function: Analyzes groups of pixels within an image and returns information about the size, shape, position, and pixel connectivity. Typical applications include determining the quality of parts, analyzing defects, locating objects, and sorting objects.

instrument driver: A set of high-level software functions, such as NI-IMAQ, that control specific plug-in computer boards. Instrument drivers are available in several forms, ranging from a function callable from a programming language to a virtual instrument (VI) in LabVIEW.

intensity: The sum of the red, green, and blue primary colors divided by three: (Red + Green + Blue)/3.

intensity calibration: Assigning user-defined quantities, such as optical densities or concentrations, to the gray-level values in an image.

intensity profile: The gray-level distrib

intensity range
intensity threshold
interpolation
IRE
JPEG
spatial filters (continued)
  classification summary (table), 5-15
  definition, 5-14
  linear filters
    Gaussian filters, 5-25
    gradient filter, 5-15
    in-depth discussion, 5-32
    Laplacian filters, 5-20
    smoothing filter, 5-23
  nonlinear filters
    differentiation filter, 5-29
    gradient filter, 5-28
    in-depth discussion, 5-33
    lowpass filter, 5-29
    median filter, 5-30
    Nth order filter, 5-30
    Prewitt filter, 5-27
    Roberts filter, 5-29
    Sigma filter, 5-29
    Sobel filter, 5-27
  when to use, 5-14
spatial frequencies, 7-1
spatial resolution of images, 1-2
Spoke function, 11-11
square pixel frame, 9-6
standard representation (FFT display), 7-3
structuring elements
  basic concepts, 9-2
  dilation function effects (table), 9-12
  erosion function effects (table), 9-12
  pixel frame shape, 9-4
  size, 9-2
  values, 9-3
  when to use, 9-1
Subtract operator (table), 6-2
sum measurements, digital particles, 10-17
support, technical, B-1
system integration services, B-1
system setup. See also spatial calibration
  acquiring quality images, 3-3
  basic concepts, 3-1
  contrast, 3-5
  depth of field, 3-5
  distortion, 3-7
  fundamental parameters (figure), 3-2
  perspective, 3-5
  resolution, 3-3

T
tagged image file format (TIFF), 1-5
technical support, B-1
telephone technical support, B-1
Temperature palette, 2-6
thickening function (binary morphology)
  basic concepts, 9-18
  example, 9-19
thinning function (binary morphology)
  basic concepts, 9-17
  example, 9-17
thresholding
lossy compression: Compression in which the decompressed image is visually similar, but not identical, to the original image.

lowpass attenuation: Applies a linear attenuation to the frequencies in an image, with no attenuation at the lowest frequency and full attenuation at the highest frequency.

lowpass FFT filter: Removes or attenuates high frequencies present in the FFT domain of an image.

lowpass filter: Attenuates intensity variations in an image. You can use these filters to smooth an image by eliminating fine details and blurring edges.

lowpass frequency filter: Attenuates high frequencies present in the frequency domain of the image. A lowpass frequency filter suppresses information related to fast variations of light intensities in the spatial image.

lowpass truncation: Removes all frequency information above a certain frequency.

LSB: Least significant bit.

L-skeleton function: Uses an L-shaped structuring element in the skeleton function.

luma: The brightness information in the video picture. The luma signal amplitude varies in proportion to the brightness of the video signal and corresponds exactly to the monochrome picture.

luminance: See luma.

LUT: Lookup table. Table containing values used to transform the gray-level values of an image. For each gray-level value in the image, the corresponding new value is obtained from the lookup table.

M

m: Meters.

machine vision: An automated application that performs a set of visual

mask (FFT filter)
match score
In some applications, you may need to select colors with the same hue value but various saturation values. Because the luminance plane contains only information about the intensity levels in the image, you can set the luminance threshold range to include all the luminance values, making the thresholding process independent from intensity information.

Binary Morphology

Introduction
This chapter contains information about structuring elements, connectivity, and primary and advanced binary morphology operations.

Binary morphological operations extract and alter the structure of particles in a binary image. You can use these operations during your inspection application to improve the information in a binary image before making particle measurements, such as the area, perimeter, and orientation.

A binary image is an image containing particle regions with pixel values of 1 and a background region with pixel values of 0. Binary images are the result of the thresholding process. Because thresholding is a subjective process, the resulting binary image may contain unwanted information, such as noise particles, particles touching the border of images, particles touching each other, and particles with uneven borders. By affecting the shape of particles, morphological functions can remove this unwanted information, thus improving the information in the binary image.

Structuring Elements
…structuring element on the right,

    1 1 1
    1 0 1
    1 1 1

extracts all pixels equal to 0 that are surrounded by at least one layer of pixels equal to 1.

Use the hit-miss function to locate pixels along a vertical left edge. The structuring element on the right,

    1 1 0
    1 1 0
    1 1 0

extracts pixels surrounded by at least one layer of pixels equal to 1 to the left, and pixels that are equal to 0 to the right.

Thinning Function
The thinning function eliminates pixels that are located in a neighborhood matching a template specified by the structuring element. Depending on the configuration of the structuring element, you can also use thinning to remove single pixels isolated in the background and right angles along the edges of particles. The larger the size of the structuring element, the more specific the template can be.

The thinning function extracts the intersection between a source image and its transformed image after a hit-miss function. In binary terms, the operation subtracts its hit-miss transformation from the source image. Do not use this function when the central coefficient of the structuring element is equal to 0. In such cases, the hit-miss function can change only the value of certain pixels in the background from 0 to 1; however, the subtraction of the thinning function then resets these pixels back to 0.

If I is an image,

    thinning(I) = I - hit-miss(I)
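The definition thinning(I) = I - hit-miss(I) can be sketched with SciPy, whose `binary_hit_or_miss` takes the 1s and 0s of an IMAQ-style structuring element as two separate masks (`structure1` for required 1s, `structure2` for required 0s). This is my own illustrative mapping, not IMAQ Vision's API; the template here removes single pixels isolated in the background, one of the uses mentioned above.

```python
# thinning(I) = I AND NOT hit-miss(I), with a template whose center is 1
# and whose eight neighbors are 0 -- i.e., it matches isolated pixels,
# which thinning then removes.
import numpy as np
from scipy import ndimage as ndi

def thinning(img, hit, miss):
    return img & ~ndi.binary_hit_or_miss(img, structure1=hit, structure2=miss)

hit = np.zeros((3, 3), dtype=bool)
hit[1, 1] = True                 # center must be 1
miss = ~hit                      # all eight neighbors must be 0

img = np.zeros((7, 7), dtype=bool)
img[1:4, 1:4] = True             # a 3x3 particle (kept)
img[5, 5] = True                 # an isolated noise pixel (removed)
thinned = thinning(img, hit, miss)
```

Particle pixels never match the template because at least one neighbor is 1, so only the isolated pixel is subtracted.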
…The complex numbers that compose the FFT plane are encoded in 64-bit floating-point values: 32 bits for the real part and 32 bits for the imaginary part.

connectivity: Defines which of the surrounding pixels of a given pixel constitute its neighborhood.

connectivity-4: Only pixels adjacent in the horizontal and vertical directions are considered neighbors.

connectivity-8: All adjacent pixels are considered neighbors.

contrast: A constant multiplication factor applied to the luma and chroma components of a color pixel in the color decoding process.

convex hull: The smallest convex polygon that can encapsulate a particle.

convex hull function: Computes the convex regions of objects in a binary image.

convolution: See linear filter.

convolution kernel: 2D matrices or templates used to represent the filter in the filtering process. The contents of these kernels are a discrete two-dimensional representation of the impulse response of the filter that they represent.

cross correlation: The most common way to perform pattern matching.

D

Danielsson function: Similar to the distance functions, but with more accurate results.

dB: Decibel. The unit for expressing a logarithmic measure of the ratio of two signal levels: dB = 20 log10(V1/V2), for signal

default setting
definition
dendrite
densitometry
density function
differentiation filter
…truth table

square function: See exponential function.

square root function: See logarithmic function.

standard representation: Contains the low-frequency information at the corners and high-frequency information at the center of an FFT-transformed image.

See mile.

structuring element: A binary mask used in most morphological operations. A structuring element is used to determine which neighboring pixels contribute in the operation.

subpixel accuracy: Finds the location of the edge coordinates in terms of fractions of a pixel.

template: Color, shape, or pattern that you are trying to match in an image using the color matching, shape matching, or pattern matching functions. A template can be a region selected from an image, or it can be an entire image.

thickening function: Alters the shape of objects by adding parts to the object that match the pattern specified in the structuring element.

thinning function: Alters the shape of objects by eliminating parts of the object that match the pattern specified in the structuring element.

threshold: Separates objects from the background by assigning all pixels with intensities within a specified range to the object and the rest of the pixels to the background. In the resulting binary image, objects are represented with a pixel intensity of 255 and the background is set to 0.

threshold interval: Two parameters: the lower threshold gray-level value and the upper threshold gray-level value.

TIFF: Tagged Image File Format. Image format commonly used for encoding 8-bit, 16-bit, and color images (extension TIF).

tools: Collection of tools that enable you to select
…than 15 for 8-bit images (256 gray levels) to obtain accurate results. Each digit must be larger than 18 x 12 pixels to obtain accurate results.

The barcode concept applies to applications that require reading values encoded into 1D barcodes. IMAQ Vision currently supports the following barcodes: Code 25, Code 39, Code 93, Code 128, EAN 8, EAN 13, Codabar, MSI, and UPC A.

The process used to recognize barcodes consists of two phases:
• A learning phase, in which the user specifies an area of interest in the image, which helps to localize the region occupied by the barcode.
• A recognition phase, during which the region specified by the user is analyzed to decode the barcode.

Barcode Algorithm Limits
The following factors can cause errors in the decoding process:
• Very low resolution of the image
• Very high horizontal or vertical light drift
• Contrast variations along the bars of the image
• High level of noise

The limit conditions are different for barcodes that have two different widths of bars and spaces (Code 39, Codabar, Code 25, and MSI code) and for barcodes that have four different widths of bars and spaces (Code 93, Code 128, EAN 13, EAN 8, and UPC A).

The resolution of an image is determined by the width of the smallest bar and space. These widths must be at least 3 pixels for all barcodes.
…shapes, such as circles, ellipses, and polygons, that fit detected points. For more information about the types of measurements you can make, refer to your IMAQ Vision user manual.

Qualifying Measurements
The last step of a gauging application involves determining the quality of the part based on the measurements obtained from the image. You can determine the quality of the part using either relative comparisons or absolute comparisons. In many applications, the measurements obtained from the inspection image can be compared to the same measurements obtained from a standard specification or a reference image. Because all the measurements are made on images of the part, you can compare them directly. In other applications, the dimensional measurements obtained from the image must be compared with values that are specified in real units. In this case, convert the measurements from the image into real-world units using the calibration tools described in Chapter 3, System Setup and Calibration.

Coordinate System
In a typical machine vision application, measurements are extracted from an ROI rather than from the entire image. The object under inspection must always appear in the defined ROI in order to extract measurements from that ROI. When the location and orientation of the object under inspection is always the same in the inspection images, you can make measurements directly without
…sharp changes in the intensity profile. Edge detection identifies these changes.

Alignment
Alignment determines the position and orientation of a part. In many machine vision applications, the object that you want to inspect may be at different locations in the image. Edge detection finds the location of the object in the image before you perform the inspection, so that you can inspect only the regions of interest. The position and orientation of the part can also be used to provide feedback information to a positioning device, such as a stage. Figure 11-3 shows an example of detecting the left boundary of a disk in the image. You can use the location of the edges to determine the orientation of the disk.

Figure 11-3. Alignment Using Edge Detection

Edge Detection Concepts

Definition of an Edge
An edge is a significant change in the grayscale values between adjacent pixels in an image. In IMAQ Vision, edge detection works on a one-dimensional profile of pixel values along a search region, as shown in Figure 11-4. The one-dimensional search region can take the form of a line, the perimeter of a circle or ellipse, the boundary of a rectangle or polygon, or a freehand region. The software analyzes the pixel values along the profile to detect significant intensity changes. You can specify characteristics
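The one-dimensional profile analysis described above can be sketched very simply: flag positions where the grayscale difference between adjacent pixels exceeds a contrast threshold. IMAQ Vision's detector also scores edge width and steepness; this sketch illustrates only the core idea, and the threshold value is a hypothetical choice.

```python
# Minimal 1D edge detector along a pixel profile: report indices where the
# adjacent-pixel intensity difference exceeds a contrast threshold.
import numpy as np

def detect_edges(profile, contrast):
    diffs = np.diff(profile.astype(float))   # float avoids uint8 wraparound
    return [i + 1 for i, d in enumerate(diffs) if abs(d) >= contrast]

# A profile crossing a bright disk: one rising edge and one falling edge.
profile = np.array([12, 11, 13, 12, 200, 201, 199, 40, 41], dtype=np.uint8)
edges = detect_edges(profile, contrast=50)
```

The same routine applies whether the profile was sampled along a line, a circle perimeter, or a freehand region, since all of them reduce to a one-dimensional array of pixel values.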
…the linear histogram of the new image contains only the two peaks of the interval [50, 190].

Predefined Lookup Tables
Seven predefined LUTs are available in IMAQ Vision: Linear, Logarithmic, Power 1/Y, Square Root, Exponential, Power Y, and Square. Table 5-1 shows the transfer function for each LUT and describes its effect on an image displayed in a palette that associates dark colors to low-intensity values and bright colors to high-intensity values, such as the Gray palette.

Table 5-1. LUT Transfer Functions (transfer-function graphs not reproduced)
Linear: Increases the intensity dynamic by evenly distributing a given gray-level interval [min, max] over the full gray scale [0, 255]. Min and max default values are 0 and 255 for an 8-bit image.
Logarithmic, Power 1/Y, Square Root: Increases the brightness and contrast in dark regions. Decreases the contrast in bright regions.
Exponential, Power Y, Square: Decreases the brightness and the contrast in dark regions. Increases the contrast in bright regions.

Logarithmic and Inverse Gamma Correction
The logarithmic and inverse gamma corrections expand low gray-level ranges while compressing high gray-level ranges. When using the gray palette, these transformations increase the overall brightness of an image and
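A LUT is just a 256-entry table indexed by pixel value. The sketch below builds a Power Y table like the one named above (for Y greater than 1 it darkens the image and increases contrast in bright regions); the normalization to [0, 255] is my assumption, not necessarily IMAQ Vision's exact table.

```python
# Build a Power Y lookup table and apply it by direct array indexing,
# which is exactly how LUT transformations work: new = lut[old].
import numpy as np

def power_lut(gamma, bits=8):
    levels = 2 ** bits
    x = np.arange(levels) / (levels - 1)           # normalize to [0, 1]
    return np.round((x ** gamma) * (levels - 1)).astype(np.uint8)

lut = power_lut(2.0)                   # Power Y with Y = 2 (the Square LUT)
img = np.array([[0, 64, 128, 255]], dtype=np.uint8)
out = lut[img]                         # apply the LUT to every pixel
```

Replacing `x ** gamma` with `x ** (1 / gamma)`, a log, or an exponential yields the other predefined tables; the application step (`lut[img]`) never changes.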
…the main axis and secondary axis in a single search area, based on an edge detection algorithm. First, the function determines the main axis of the coordinate system, as shown in Figure 13-2a. A rake function finds the intersection pixels between multiple search lines and the edge of the object. You can specify the search direction along these search lines. The intersection points are determined by their contrast, width, and steepness. For more information about detecting edges, refer to Chapter 11, Edge Detection. A line fitted through the intersection points defines the main axis.

The function then searches for a secondary axis within the same search area, as shown in Figure 13-2b. The software uses multiple lines that are parallel to the main axis to scan for edges, and then fits a line through the edge of the object closest to the search area and perpendicular to the main axis. This line defines the secondary axis of the coordinate system. The secondary axis must not be parallel to the main axis. The intersection between the main axis and the secondary axis defines the origin of the reference coordinate system.

Figure 13-2 legend (partial): 1. Search Area for the Coordinate System; 3. Main Axis; 5. Origin of the Reference
…the position of the pattern in the image
• Multiple instances of the pattern in the image, if applicable

Because color location works only on the color information of a region and does not use any kind of shape information from the template, it does not find the angle of rotation of the match. It only locates the position of a region in the image whose size matches a template containing similar color information. Refer to Figure 14-8 for an example of pattern orientation and multiple instances. Figure 14-8a shows a template image. Figures 14-8b and 14-8c show multiple shifted and rotated occurrences of the template.

Ambient Lighting Conditions
The color location tool finds the reference pattern in an image under conditions of uniform changes in the lighting across the image. Color location also finds patterns under conditions of non-uniform light changes, such as shadows. Figure 14-10 illustrates typical conditions under which the color location tool works correctly. Figure 14-10a shows the original template image. Figure 14-10b shows the same pattern under bright light. Figure 14-10c shows the pattern under poor lighting.

Figure 14-10. Examples of Lighting Conditions

Blur and Noise Conditions
Color location finds patterns that have undergone some transformation because of blurring or noise.
when analyzing particles similar to the one in Figure 10-2a. For example, if you threshold a cell with a dark nucleus (Figure 10-2a) so that the nucleus appears as a hole in the cell (Figure 10-2b), you can make the following measurements about the cell:
• Holes Area: Returns the size of the nucleus
• Particle & Holes Area: Returns the size of the entire cell
• Holes Area / Particle & Holes Area: Returns the percentage of the cell that the nucleus occupies
[Figure 10-2: Holes Measurements]
Particle Measurement Definitions
The following tables include definitions, symbols, and equations for the particle measurements available in IMAQ Vision.
Note: Some equation symbols may be defined inside tables later in the chapter.
Coordinates
Table 10-2 lists the IMAQ Vision particle measurements relating to coordinates.
Table 10-2. Coordinates
Measurement | Definition
Center of Mass | Point representing the average position of the total particle mass, assuming every point in the particle has a constant density. The center of mass can be located outside the particle if the particle is not convex.
First Pixel | Highest, leftmost particle pixel. The first pixel is always given as a pixel measurement. The black squares in
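The three holes measurements listed above can be sketched for a small synthetic particle. The following Python example is illustrative only (IMAQ Vision itself is a LabVIEW/C library, and the `cell` mask is a made-up stand-in for the thresholded cell); it flood-fills the background from the image border so that any remaining zero pixels count as holes:

```python
import numpy as np

def holes_measurements(particle):
    """Return (Holes Area, Particle & Holes Area) for one binary particle.
    Zero pixels reachable from the image border (connectivity-4) are
    background; the remaining zero pixels are holes."""
    h, w = particle.shape
    outside = np.zeros((h, w), dtype=bool)
    # seed the flood fill with every border pixel
    stack = [(r, c) for r in range(h) for c in (0, w - 1)]
    stack += [(r, c) for c in range(w) for r in (0, h - 1)]
    while stack:
        r, c = stack.pop()
        if 0 <= r < h and 0 <= c < w and not particle[r, c] and not outside[r, c]:
            outside[r, c] = True
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    holes_area = int(((particle == 0) & ~outside).sum())
    return holes_area, int(particle.sum()) + holes_area

# hypothetical "cell with a dark nucleus" particle
cell = np.array([[0, 1, 1, 1, 1, 0],
                 [1, 1, 1, 1, 1, 1],
                 [1, 1, 0, 0, 1, 1],
                 [1, 1, 0, 0, 1, 1],
                 [1, 1, 1, 1, 1, 1],
                 [0, 1, 1, 1, 1, 0]])
holes, total = holes_measurements(cell)
ratio = holes / total   # fraction of the cell occupied by the nucleus
```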
this case, each level corresponds to a different color. You can easily determine the relation of a set of pixels to the border of a particle: the first layer, which forms the border, is red; the second layer, closest to the border, is green; the third layer is blue; and so forth.
Circle Function
The circle function allows you to separate overlapping circular particles. The circle function uses the Danielsson coefficient to reconstitute the form of a particle, provided that the particles are essentially circular. The particles are treated as a set of overlapping discs that are then separated into individual discs, which allows you to trace circles corresponding to each particle.
Figure 9-29a illustrates the source image for the following example of the circle function. Figure 9-29b illustrates the processed image.
[Figure 9-29: Circle Function]
Convex Hull Function
The convex hull function is useful for closing particles so that measurements can be made on the particle even though the particle contour is discontinuous. The convex hull function calculates a convex envelope around each particle, effectively closing the particle. The image on which you use the convex hull function must be binary.
Figure 9-30a represents the original binary labeled im
this method, the threshold value is computed in such a way that the moments of the image to be thresholded are preserved in the output binary image. The k-th moment m_k of an image is calculated as

m_k = (1/n) Σ_{i=0}^{N-1} i^k h(i)

where n is the total number of pixels in the image.
Color Thresholding
Color thresholding converts a color image to a binary image. To threshold a color image, you must specify a threshold interval for each of the three color components. A pixel in the output image is set if and only if its color components fall within the specified ranges. Otherwise, the pixel is set to 0.
When to Use
Threshold a color image when you need to isolate features for analysis and processing and remove features that do not interest you.
Note: Before performing a color threshold, you may need to enhance your image with lookup tables or the equalize function.
To threshold a color image, specify a threshold interval for each of the three color components. The value of a pixel in the original image is set to 1 if and only if its color components fall within the specified ranges. Otherwise, the pixel value is set to 0.
Figure 8-1 shows the histograms of each plane of a color image stored in RGB format. The gray shaded region indicates the threshold range for each of the color planes. For a pixel in the color image to be set to 1 in the binary image, its red value should lie between 130 and 200, its green value should lie betwe
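A color threshold of this kind is easy to sketch with NumPy. In the example below, the red range 130 to 200 comes from the text, but the green and blue ranges are made-up placeholders, and the function is an illustration rather than the IMAQ Vision implementation:

```python
import numpy as np

def color_threshold(rgb, r_range, g_range, b_range):
    """Set an output pixel to 1 iff each of its three color components
    falls within its threshold interval; otherwise set it to 0."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = ((r_range[0] <= r) & (r <= r_range[1]) &
            (g_range[0] <= g) & (g <= g_range[1]) &
            (b_range[0] <= b) & (b <= b_range[1]))
    return mask.astype(np.uint8)

img = np.zeros((1, 2, 3), dtype=np.uint8)
img[0, 0] = (150, 160, 170)   # all three components in range
img[0, 1] = (90, 160, 170)    # red below 130, so the pixel is rejected
# red range from the text; green/blue ranges are hypothetical
binary = color_threshold(img, (130, 200), (100, 255), (0, 255))
```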
threshold. The following notations are used to describe the parameters of the histogram:
[Histogram figure: h(i) plotted against Gray Level Value, split at level k into Class 0 and Class 1]
i: Represents the gray level value
k: Represents the gray level value chosen as the threshold
h(i): Represents the number of pixels in the image at each gray level value
N: Represents the total number of gray levels in the image (256 for an 8-bit image)
n: Total number of pixels in the image
IMAQ Vision has five automatic thresholding techniques:
• Clustering
• Entropy
• Inter-Variance
• Metric
• Moments
All five methods can be used to threshold an image into two classes. The auto-thresholding techniques determine the threshold pixel value k such that all gray level values less than or equal to k belong to one class (0) and the other gray level values belong to another class (1), as shown in the figure above. Use the clustering method when the image must be thresholded into more than two classes.
Clustering
The threshold value is the pixel value k for which the following condition is true:

k = (μ1 + μ2) / 2

where μ1 is the mean of all pixel values that lie between 0 and k, and μ2 is the mean of all the pixel values that lie between k+1 and 255.
Entropy
In this method, the threshold value is obtained by applying information theory to the histogram data. In information theo
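The clustering condition k = (μ1 + μ2)/2 can be solved by simple iteration. The following sketch is an assumption about the iteration scheme (the manual does not specify how IMAQ Vision searches for k); it starts from the image mean and repeats until k is stable:

```python
import numpy as np

def clustering_threshold(hist):
    """Iteratively find k such that k = (mu1 + mu2) / 2, where mu1 is the
    mean gray value in [0, k] and mu2 the mean in [k+1, 255]."""
    levels = np.arange(len(hist))
    k = int((levels * hist).sum() / max(hist.sum(), 1))  # start at image mean
    for _ in range(256):  # convergence happens long before this bound
        w1, w2 = hist[:k + 1].sum(), hist[k + 1:].sum()
        mu1 = (levels[:k + 1] * hist[:k + 1]).sum() / w1 if w1 else 0.0
        mu2 = (levels[k + 1:] * hist[k + 1:]).sum() / w2 if w2 else 255.0
        new_k = int((mu1 + mu2) / 2)
        if new_k == k:
            break
        k = new_k
    return k

pixels = np.array([50] * 100 + [200] * 100)   # synthetic bimodal image
hist = np.bincount(pixels, minlength=256)
```

For this synthetic histogram with peaks at 50 and 200, the iteration settles at k = 125, midway between the two class means.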
171. hology description 5 37 examples 5 38 operators arithmetic 6 2 basic concepts 6 1 logic and comparison 6 2 when to use 6 1 optical representation FFT display 7 4 OR operator table 6 3 orientations digital particles Max Feret Diameter orientation 10 15 particle orientation 10 15 outer gradient function binary morphology 9 14 overlay See nondestructive overlay National Instruments Corporation Index P palettes basic concepts 2 4 Binary palette 2 7 definition 2 4 Gradient palette 2 7 Gray palette 2 5 Rainbow palette 2 6 Temperature palette 2 6 when to use 2 4 Particle amp Holes Area digital particles 10 13 particle analysis basic concepts III 2 parameters III 3 when to use III 2 particle hole definition 10 3 holes area 10 5 holes perimeter 10 9 particle measurements See digital particles particles definition III 1 pattern matching See also color pattern matching edge detection binary shape matching 12 9 coordinate system for dimensional measurements 13 7 cross correlation in depth discussion 12 7 overview 12 5 grayscale pattern matching combining color location and grayscale pattern matching 14 22 methods 14 22 new techniques 12 6 overview 12 1 pyramidal matching 12 5 scale invariant matching 12 5 shape matching 12 9 IMAQ Vision Concepts Manual Index traditional techniques 12 4 what to expect ambient lighting conditions 12 4 blur and noise co
light changes, such as in the presence of shadows, than grayscale pattern matching. Figure 14-15a shows the original template image. Figure 14-15b shows the same pattern under bright light. Figure 14-15c shows the pattern under poor lighting.
[Figure 14-15: Examples of Lighting Conditions]
Blur and Noise Conditions
Color pattern matching finds patterns that have undergone some transformation because of blurring or noise. Blurring usually occurs because of incorrect focus or depth-of-field changes.
Color Pattern Matching Concepts
Color pattern matching is a unique approach that combines color and spatial information to quickly find color patterns in an image. It uses the technologies behind color matching and grayscale pattern matching in a synergistic way to locate color patterns in color images.
Color Matching and Color Location
Color matching compares the color content of an image, or regions in an image, to existing color information. The color information in the image may consist of one or more colors. To use color matching, define regions in an image that contain the color information you want to use as a reference. The machine vision functions then learn the three-dimensional color information in the image and represent it as a one-dimensional color spectrum. Your machine vision application compares the color information in the entire im
icle Length | This measurement is always given as a pixel measurement.
Sum | Moments of various orders relative to the x-axis and y-axis.
Table 10-1. Particle Concepts (Continued)
Concept | Definition
Moment of Inertia | Moments about the particle center of mass. Gives a representation of the pixel distribution in a particle with respect to the particle center of mass. Moments of inertia are shift invariant.
Norm Moment of Inertia | Moment of inertia normalized with regard to the particle area. Normalized moments of inertia are shift and scale invariant.
Hu Moment | Moments derived from the norm moment of inertia measurements. Hu moments are shift, scale, and rotation invariant.
Particle Holes
A particle hole is a contiguous region of zero-valued pixels completely surrounded by pixels with nonzero values. A particle located inside a hole of a bigger particle is identified as a separate particle. The area of a hole that contains a particle includes the area covered by the particle.
[Figure: Particles 1 through 4, with segments A through G marking areas along the line O]
Particle | Particle Area | Holes Area | Particle & Holes Area
Particle 1 | A | B + C | A + B + C
Particle 2 | D | 0 | D
Particle 3 | E | F + G | E + F + G
Particle 4 | G | 0 | G
Holes measurements become valuable data w
origin of the image f is at the top-left corner. Correlation is the process of moving the template, or subimage, w around the image area and computing the value C in that area. This involves multiplying each pixel in the template by the image pixel that it overlaps and then summing the results over all the pixels of the template. The maximum value of C indicates the position where w best matches f. Correlation values are not accurate at the borders of the image.
[Figure 12-7: Correlation Procedure; an L x K template w(x, y) is moved across an M x N image f(x, y), with the origin (0, 0) at the top-left corner]
Basic correlation is very sensitive to amplitude changes, such as intensity, in the image and in the template. For example, if the intensity of the image f is doubled, so are the values of C. You can overcome this sensitivity by computing the normalized correlation coefficient, which is defined as

R(i,j) = [ Σ_{x=0}^{L-1} Σ_{y=0}^{K-1} (w(x,y) - w̄)(f(x+i, y+j) - f̄) ] / sqrt( [Σ_{x=0}^{L-1} Σ_{y=0}^{K-1} (w(x,y) - w̄)²] [Σ_{x=0}^{L-1} Σ_{y=0}^{K-1} (f(x+i, y+j) - f̄)²] )

where w̄ (calculated only once) is the average intensity value of the pixels in the template w. The variable f̄ is the average value of f in the region coincident with the current location of w. The value of R lies in the range -1 to 1 and is independent of scale changes in the intensity values of f and w.
Shape Matching
When to Use
Shape matchi
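The normalized correlation coefficient can be computed directly from its definition. This brute-force Python sketch is illustrative only (not the optimized IMAQ Vision implementation); it slides the template over the image and returns the map of R values:

```python
import numpy as np

def normalized_correlation(f, w):
    """Return the map R(i, j) of normalized correlation coefficients of
    template w over image f; values lie in [-1, 1]."""
    L, K = w.shape
    out = np.zeros((f.shape[0] - L + 1, f.shape[1] - K + 1))
    wm = w - w.mean()                    # template mean is computed only once
    w_norm = np.sqrt((wm ** 2).sum())
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            region = f[i:i + L, j:j + K]
            rm = region - region.mean()  # mean of f over the current region
            denom = w_norm * np.sqrt((rm ** 2).sum())
            out[i, j] = (wm * rm).sum() / denom if denom else 0.0
    return out

rng = np.random.default_rng(0)
f = rng.random((8, 8))
w = f[2:5, 3:6]        # template cut from the image at offset (2, 3)
R = normalized_correlation(f, w)
```

Because the template was cut from the image itself, the map peaks at exactly (2, 3) with R = 1.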
Figure 11-6. Examples of Edges with Different Strengths
• Edge length: Defines the maximum distance in which the desired grayscale difference between the edge and the background must occur. The length characterizes the slope of the edge. Use a longer length to detect edges with a gradual transition between the background and the edge.
• Edge polarity: Describes whether an edge is rising or falling. A rising edge is characterized by an increase in grayscale values as you cross the edge. A falling edge is characterized by a decrease in grayscale values as you cross the edge. The polarity of an edge is linked to the search direction. Figure 11-7 shows examples of edge polarities.
• Edge position: The x, y location of an edge in the image. Figure 11-5 shows the edge position for the edge model.
[Figure 11-7: Edge Polarity; falling edge (negative polarity) and rising edge (positive polarity)]
Edge Detection Methods
IMAQ Vision provides two ways to perform edge detection. Both methods compute the edge strength at each pixel along the one-dimensional profile. An edge occurs when the edge strength is greater than a minimum strength. Additional checks find the correct location of the edge. You can specify the minimum strength by using the contrast parameter in the software.
Simple Edge Detection
The software uses the pixel value at any point along the pi
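A minimal version of edge detection on a one-dimensional profile might look like the following sketch, where `contrast` plays the role of the minimum edge strength and the sign of the grayscale step gives the polarity. The exact checks IMAQ Vision applies are not reproduced here:

```python
import numpy as np

def detect_edges(profile, contrast):
    """Scan a 1-D line profile and report edges whose grayscale step
    meets the minimum strength (the contrast parameter)."""
    edges = []
    for i, step in enumerate(np.diff(profile.astype(int))):
        if abs(step) >= contrast:
            # rising edge: grayscale increases along the search direction
            edges.append((i + 1, 'rising' if step > 0 else 'falling'))
    return edges

profile = np.array([10, 10, 200, 200, 15, 15], dtype=np.uint8)
```

For this profile, a contrast of 100 yields a rising edge at index 2 and a falling edge at index 4.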
Dilation and erosion are morphological opposites. Bright borders expanded by the dilation are reduced by the erosion. However, small dark particles that vanish during the dilation do not reappear after the erosion.
Opening and Closing Examples
This example uses the following source image.
A closing function produces the following image.
Note: Consecutive applications of an opening or closing function always give the same results.
Proper Opening Function
The gray-level proper opening function is a finite and dual combination of openings and closings. It removes bright pixels isolated in dark regions and smooths the boundaries of bright regions. The effects of the function are moderated by the configuration of the structuring element.
Proper Closing Function
The proper closing function is a finite and dual combination of closings and openings. It removes dark pixels isolated in bright regions and smooths the boundaries of dark regions. The effects of the function are moderated by the configuration of the structuring element.
Auto-Median Function
The auto-median function uses dual combinations of openings and closings. It generates simpler particles that have fewer details.
In-Depth Discussion
Erosion Concept and Mathematics
Each pixel in an image becomes equal to the minimum value of its neighbors. For a given pixel P0, the structuring element is centered on P0. The pixels mas
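The erosion rule described above, where each pixel becomes the minimum of its neighborhood, can be sketched for a full 3 x 3 structuring element. Edge replication at the borders is an assumption of this illustration, not necessarily IMAQ Vision's border handling:

```python
import numpy as np

def grey_erosion_3x3(img):
    """Grayscale erosion with a full 3x3 structuring element: each pixel
    becomes the minimum of its 3x3 neighborhood (edge-replicated borders)."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    # nine shifted views of the padded image, one per structuring-element pixel
    shifted = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.minimum.reduce(shifted)

img = np.array([[5, 5, 5, 5],
                [5, 9, 9, 5],
                [5, 9, 9, 5],
                [5, 5, 5, 5]])
```

The bright 2 x 2 block is too small to survive: every 3 x 3 neighborhood contains a 5, so the eroded image is uniformly 5.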
image.
Kernel Definition
A gradient convolution filter is a first-order derivative. Its kernel uses the following model:

 a   b   c
 d   x  -d
-c  -b  -a

where a, b, c, and d are integers and x = 0 or 1.
Filter Axis and Direction
This kernel has an axis of symmetry that runs between the positive and negative coefficients of the kernel and through the central element. This axis of symmetry gives the orientation of the edges to outline. For example, if a = 0, b = -1, c = -1, d = 1, and x = 0, the kernel is the following:

 0  -1  -1
 1   0  -1
 1   1   0

The axis of symmetry is located at 135°. For a given direction, you can design a gradient filter to highlight or darken the edges along that direction. The filter actually is sensitive to the variations of intensity perpendicular to the axis of symmetry of its kernel. Given the direction D going from the negative coefficients of the kernel towards the positive coefficients, the filter highlights the pixels where the light intensity increases along the direction D and darkens the pixels where the light intensity decreases. The following two kernels emphasize edges oriented at 135°.
Prewitt #10 highlights pixels where the light intensity increases along the direction going from northeast to southwest. It darkens pixels where the light intensity decreases along that same direction. This processing outlines the northeast front edges of bright regions su
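Applying such a kernel is a plain sliding-window weighted sum of the image with the 3 x 3 coefficients. The sketch below (illustrative only; it uses correlation without kernel flipping, and edge-replicated borders are an assumption) runs the 135° example kernel from the text over a synthetic image with a bright region on the right. Flat regions map to 0 because the kernel coefficients sum to 0:

```python
import numpy as np

def apply_kernel(img, kernel):
    """Slide a 3x3 kernel over the image (correlation, no normalization,
    edge-replicated borders) and return the filtered image."""
    p = np.pad(img, 1, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (p[i:i + 3, j:j + 3] * kernel).sum()
    return out

prewitt_135 = np.array([[ 0, -1, -1],
                        [ 1,  0, -1],
                        [ 1,  1,  0]])
img = np.zeros((5, 5))
img[:, 3:] = 10.0            # bright region on the right half
edges = apply_kernel(img, prewitt_135)
```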
image (the first row, last row, first column, or last column of pixels), part of the kernel falls outside the image. For example, Figure 5-3 shows that one row and one column of a 3 x 3 kernel fall outside the image when computing the value of the topmost, leftmost pixel.
[Figure 5-3: Filtering Border Pixels; 1: Border, 2: Image, 3: Kernel]
IMAQ Vision automatically allocates a border region when you create an image. The default border region is three pixels deep and contains pixel values of 0. You also can define a custom border region and specify the pixel values within the region. The size of the border region should be greater than or equal to half the number of rows or columns in your kernel. The filtering results along the border of an image are unreliable because the neighbors necessary to compute these values are missing, which decreases the efficiency of the filter; it works on a much smaller number of pixels than specified for the rest of the image. For more information about border regions, refer to Chapter 1, Digital Images.
Spatial Filtering
Filters are divided into two types: linear (also called convolution) and nonlinear. A convolution is an algorithm that consists of recalculat
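Allocating a border region before filtering can be mimicked with padding. This sketch (an illustration, not IMAQ Vision's internal allocation) follows the rule above: the border depth is at least half the kernel size, and the padded pixels are 0 as in the default border region:

```python
import numpy as np

kernel_size = 5
border = kernel_size // 2        # border depth >= half the kernel size
img = np.arange(16, dtype=np.uint8).reshape(4, 4)
# default-style border region: filled with pixel values of 0
bordered = np.pad(img, border, mode='constant', constant_values=0)
```

A 5 x 5 kernel centered on any original pixel of `bordered` now always has a full neighborhood to read from.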
179. imensional measurements 13 3 edge based functions 13 5 pattern matching based functions 13 7 steps for defining 13 4 when to use 13 3 spatial calibration axis direction figure 3 10 origin and angle figure 3 9 redefining 3 17 coordinates digital particles 10 7 correction region in calibration 3 15 cross correlation in pattern matching correlation procedure figure 12 8 in depth discussion 12 7 overview 12 5 cumulative histogram 4 3 customer education B 1 professional services B 1 technical support B 1 D Danielsson function 9 28 densitometry parameters 4 6 depth of field definition 3 3 setting 3 5 detection application See edge detection diagnostic resources B 1 differentiation filter definition 5 29 mathematical concepts 5 34 digital image processing definition 1 1 digital images color spaces CIE L a b color space 1 19 CMY color space 1 19 color sensations 1 14 ni com Index common types of color spaces 1 13 sums 10 17 definition 1 13 when to measure 10 1 HSL color space 1 17 dilation function RGB color space 1 15 binary morphology transformations basic concepts 9 11 RGB and CIE L a b 1 24 examples 9 11 RGB and CIE XYZ 1 21 structure element effects table 9 12 RGB and CMY 1 25 grayscale morphology RGB and HSL 1 20 concept and mathematics 5 40 RGB and YIQ 1 25 examples 5 36 RGB to grayscale 1 20 purpose and use 5 36 when to use 1 13 dimensional measurements
180. ine profile 4 5 image area digital particles 10 13 image borders definition 1 8 size of border 1 8 specifying pixel values 1 8 image correction in calibration 3 13 image definition bit depth 1 2 image display basic concepts 2 1 display modes 2 2 National Instruments Corporation 1 9 Index mapping methods for 16 bit image display 2 2 when to use 2 1 image files and formats 1 5 image masks applying with different offsets figure 1 12 definition 1 10 effect of mask figure 1 11 offset for limiting image mask 1 11 using image masks 1 11 when to use 1 10 image processing convolution kernels 5 10 definition 2 1 grayscale morphology functions auto median 5 39 basic concepts 5 36 closing 5 38 concepts and mathematics 5 39 dilation 5 36 erosion 5 36 opening 5 37 proper closing 5 39 proper opening 5 39 when to use 5 35 lookup table transformations 5 1 lookup tables basic concepts 5 1 Equalize 5 8 exponential and gamma correction 5 6 logarithmic and inverse gamma correction 5 4 predefined 5 3 when to use 5 1 spatial filters classification summary table 5 15 in depth discussion 5 31 linear filters 5 15 IMAQ Vision Concepts Manual Index nonlinear filters 5 27 when to use 5 14 image types color images 1 5 complex images 1 5 grayscale images 1 4 image understanding in pattern matching 12 6 image visualization definition 2 1 images See also digital images c
ing the value of a pixel based on its own pixel value and the pixel values of its neighbors, weighted by the coefficients of a convolution kernel. The sum of this calculation is divided by the sum of the elements in the kernel to obtain a new pixel value. The size of the convolution kernel does not have a theoretical limit and can be either square or rectangular (3 x 3, 5 x 5, 5 x 7, 9 x 3, 127 x 127, and so on).
Convolutions are divided into four families: gradient, Laplacian, smoothing, and Gaussian. This grouping is determined by the convolution kernel contents, or the weight assigned to each pixel, which depends on the geographical position of that pixel in relation to the central kernel pixel.
IMAQ Vision features a set of standard convolution kernels for each family and for the usual sizes (3 x 3, 5 x 5, and 7 x 7). You also can create your own kernels and choose what to put into them. The size of the user-defined kernel is virtually unlimited. With this capability, you can create filters with specific characteristics.
When to Use
Spatial filters serve a variety of purposes, such as detecting edges along a specific direction, contouring patterns, reducing noise, and detail outlining or smoothing. Filters smooth, sharpen, transform, and remove noise from an image so that you can extract the information you need.
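The recalculation rule above, a kernel-weighted sum of the neighborhood divided by the sum of the kernel elements, can be written out directly. This naive sketch (no optimization; edge-replicated borders are an assumption) reproduces the classic normalized convolution:

```python
import numpy as np

def convolve_normalized(img, kernel):
    """Recompute each pixel as the kernel-weighted sum of its neighborhood,
    divided by the sum of the kernel elements."""
    kh, kw = kernel.shape
    padded = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2), mode='edge')
    out = np.empty(img.shape, dtype=float)
    norm = kernel.sum() or 1   # gradient/Laplacian kernels sum to 0: skip division
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (padded[i:i + kh, j:j + kw] * kernel).sum() / norm
    return out
```

With a 3 x 3 smoothing kernel of all ones, a constant image passes through unchanged, since each neighborhood sum is exactly nine times the pixel value.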
182. inite combination of successive opening and closing operations that you can use to remove small particles and smooth the boundaries of objects Points IMAQ Vision Concepts Manual Glossary pyramidal matching Q quantitative analysis R real time relative accuracy resolution reverse function RGB Roberts filter ROI ROI tools rotational shift rotation invariant matching IMAQ Vision Concepts Manual A technique used to increase the speed of a pattern matching algorithm by matching subsampled versions of the image and the reference pattern Obtaining various measurements of objects in an image A property of an event or system in which data is processed as it is acquired instead of being accumulated and processed at a later time A measure in LSB of the accuracy of an ADC it includes all nonlinearity and quantization errors but does not include offset and gain errors of the circuitry feeding the ADC The number of rows and columns of pixels An image composed of m rows and n columns has a resolution of m x n Inverts the pixel values in an image producing a photometric negative of the image Color encoding scheme using red green and blue RGB color information where each pixel in the color image is encoded using 32 bits 8 bits for red 8 bits for green 8 bits for blue and 8 bits for the alpha value unused Extracts the contours edge detection in gray level favoring diagonal edges R
183. ion in dimensional measurements calculation of mean square distance figure 13 15 ni com data set and fitted line figure 13 14 strongest line fit figure 13 16 line profile 4 5 line profile when to use 4 5 linear filters classes table 5 15 Gaussian filters example 5 26 kernel definition 5 26 predefined kernels A 7 gradient filters edge extraction and edge highlighting 5 18 edge thickness 5 19 example 5 16 filter axis and direction 5 17 kernel definition 5 16 in depth discussion 5 31 Laplacian filters contour extraction and highlighting 5 21 contour thickness 5 23 example 5 20 kernel definition 5 21 predefined kernels A 5 overview 5 15 smoothing filters example 5 24 kernel definition 5 24 predefined kernels A 6 linear histogram 4 3 logarithmic and inverse gamma correction basic concepts 5 4 examples 5 4 summary table 5 3 logic and comparison operators examples 6 5 list of operators table 6 3 purpose and use 6 2 National Instruments Corporation Index truth tables 6 4 using with binary image masks table 6 3 Logic Difference operator table 6 3 lookup table transformations basic concepts 5 1 examples 5 2 when to use 5 1 lookup tables Equalize 5 8 exponential and gamma correction 5 6 logarithmic and inverse gamma correction 5 4 predefined lookup tables 5 3 lowpass filters binary morphology basic concepts 9 23 effects table 9 23 example 9 24 c
region of the image in which the software searches for the feature by specifying an ROI that encloses the feature. Defining an ROI in which you expect to find the feature can prevent mismatches if the feature appears in multiple regions of the image. A small ROI also may improve the locating speed.
Complete the following general steps to define a coordinate system and make measurements based on the new coordinate system:
1. Define a reference coordinate system.
   a. Define a search area that encompasses the reference feature or features on which you base your coordinate system. Make sure that the search area encompasses the features in all your inspection images.
   b. Locate an easy-to-find reference feature of the object under inspection. That feature serves as the base for a reference coordinate system in a reference image. You can use two main techniques to locate the feature: edge detection or pattern matching. The software builds a coordinate system to keep track of the location and orientation of the object in the image.
2. Set up measurement areas within the reference image in which you want to make measurements.
3. Acquire an image of the object to inspect or measure.
4. Update the coordinate system. During this step, IMAQ Vision locates the features in the search area and builds a new coordinate system based on the new location of the features.
5. Make measurements within the updated measurement area. IMAQ Vision computes the diffe
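Once the updated coordinate system's origin and angle are known, measurement points can be re-expressed relative to it. The following sketch shows only one plausible convention; the rotation sense and the handling of axis orientation are assumptions of this illustration, not IMAQ Vision's API:

```python
import numpy as np

def to_coordinate_system(points, origin, angle_deg):
    """Express pixel points in a coordinate system whose x-axis is rotated
    by angle_deg about the given origin (indirect, image-style axis
    orientation assumed)."""
    a = np.radians(angle_deg)
    rot = np.array([[ np.cos(a), np.sin(a)],
                    [-np.sin(a), np.cos(a)]])   # system-from-pixel rotation
    return (np.asarray(points, float) - np.asarray(origin, float)) @ rot.T
```

With origin (10, 10) and angle 0, the pixel point (13, 14) becomes (3, 4); with angle 90, a point 4 units along the rotated x-axis comes out as (4, 0).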
ion speed.
[Figure 14-1: HSL Color Space; chromaticity (hue and saturation) plane with a luminance axis running from black to white]
Colors represented in the HSL model space are easy for humans to quantify. The luminance (intensity) component in the HSL space is separated from the color information. This feature leads to a more robust color representation independent of light intensity variation. However, the chromaticity (hue and saturation) plane cannot be used to represent the black and white colors that often are the background colors in many machine vision applications. Refer to Chapter 1, Digital Images, for more information about color spaces.
Generating the Color Spectrum
Each element in the color spectrum array corresponds to a bin of colors in the HSL space. The last two elements of the array represent black and white colors, respectively. Figure 14-2 illustrates how the HSL color space is divided into bins. The hue space is divided into a number of equal sectors, and each sector is further divided into two parts: a part with high saturation values and another part with low saturation values. Each of these parts corresponds to a color bin, an element in the color spectrum array.
[Figure 14-2: The HSL Space Divided into Bins; 1: Sector, 2: Saturation Threshold, 3: Color Bins]
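Mapping an HSL pixel to a color-spectrum bin then reduces to picking a hue sector and comparing saturation to a threshold. The sector count and saturation threshold below are invented placeholders (the manual does not give their values), so this is only a sketch of the binning idea:

```python
def color_bin(hue, saturation, n_sectors=14, sat_threshold=80):
    """Map an HSL pixel to a color-spectrum bin index: each hue sector is
    split into a high-saturation part (even index) and a low-saturation
    part (odd index). n_sectors and sat_threshold are example values."""
    sector = int(hue) * n_sectors // 256   # hue assumed to lie in 0..255
    return sector * 2 + (1 if saturation < sat_threshold else 0)
```

Black and white would occupy two extra bins appended after these hue/saturation bins, as the text describes.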
ion to consider pixels to be part of the same particle even if the pixels touch only at a corner.
Connectivity Concepts
With connectivity-4, two pixels are considered part of the same particle if they are horizontally or vertically adjacent. With connectivity-8, two pixels are considered part of the same particle if they are horizontally, vertically, or diagonally adjacent. Figure 9-8 illustrates the two types of connectivity.
[Figure 9-8: Connectivity Types; connectivity-4 and connectivity-8]
Figure 9-9 illustrates how connectivity-4 and connectivity-8 affect the way the number of particles in an image is determined. In Figure 9-9a, the image has two particles with connectivity-4. In Figure 9-9b, the same image has one particle with connectivity-8.
[Figure 9-9: Example of Connectivity]
In-Depth Discussion
Processing in a rectangular pixel frame, each pixel P0 has eight neighbors. From a mathematical point of view, the four horizontally and vertically adjacent neighbors are closer to P0 than the four diagonally adjacent neighbors. If D is the distance from P0 to a horizontally or vertically adjacent neighbor, then the distances between P0 and its eight neighbors range from D to √2 D, as shown in the following graphic:

√2 D    D    √2 D
  D     0      D
√2 D    D    √2 D
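The effect of the two connectivity rules can be demonstrated on a two-pixel diagonal image like the one in Figure 9-9. This labeling sketch is a generic flood-fill particle counter, not IMAQ Vision's algorithm; it finds two particles under connectivity-4 and one under connectivity-8:

```python
import numpy as np

def count_particles(img, connectivity=4):
    """Count particles in a binary image under connectivity-4 or -8."""
    if connectivity == 4:
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        nbrs = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)]
    h, w = img.shape
    seen = np.zeros((h, w), dtype=bool)
    count = 0
    for r in range(h):
        for c in range(w):
            if img[r, c] and not seen[r, c]:
                count += 1                    # new particle found
                stack = [(r, c)]
                seen[r, c] = True
                while stack:                  # flood-fill the whole particle
                    cr, cc = stack.pop()
                    for dr, dc in nbrs:
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < h and 0 <= nc < w
                                and img[nr, nc] and not seen[nr, nc]):
                            seen[nr, nc] = True
                            stack.append((nr, nc))
    return count

diag = np.array([[1, 0],
                 [0, 1]])   # two pixels touching only at a corner
```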
ions; they affect only the display. The overlay is associated with an image, so there are no special overlay data types. You need only to add the overlay to your image. IMAQ Vision clears the overlay anytime you change the size or orientation of the image, because the overlay ceases to have meaning. You can save overlays with images using the PNG file format.
System Setup and Calibration
This chapter describes how to set up an imaging system and calibrate the imaging setup so that you can convert pixel coordinates to real-world coordinates. Converting pixel coordinates to real-world coordinates is useful when you need to make accurate measurements from inspection images using real-world units.
Setting Up Your Imaging System
Before you acquire, analyze, and process images, you must set up your imaging system. Five factors comprise an imaging system: field of view, working distance, resolution, depth of field, and sensor size. Figure 3-1 illustrates these concepts.
[Figure 3-1: Fundamental Parameters of an Imaging System; 1: Resolution, 2: Field of View, 3: Working Distance, 4: Sensor Size, 5: Depth of Field, 6: Image, 7: Pixel, 8: Pixel Resolution]
• Resolution
manipulate color information, such as computer graphics and image processing. Color CRT monitors, the majority of color video cameras, and most computer graphics systems use the RGB color space. The HSL space, combined with RGB and YIQ, is frequently used in applications that manipulate color, such as image processing. The color picture publishing industry uses the CMY color space, also known as CMYK. The YIQ space is the standard for color TV broadcast.
The RGB Color Space
The RGB color space is the most commonly used color space. The human eye receives color information in separate red, green, and blue components through cones, the color receptors present in the human eye. These three colors are known as additive primary colors. In an additive color system, the human brain processes the three primary light sources and combines them to compose a single color image. The three primary color components can combine to reproduce almost all other possible colors. You can visualize the RGB space as a three-dimensional cube with red, green, and blue at the corners of each axis, as shown in Figure 1-7. Black is at the origin, while white is at the opposite corner of the cube. Each side of the cube has a value between 0 and 1. Along each axis of the RGB cube, the colors range from no contribution of that component to a fully saturated color. Any point (or color) within the cube is specified by three numbers: an R
189. is publication may not be reproduced or transmitted in any form electronic or mechanical including photocopying recording storing in an information retrieval system or translating in whole or in part without the prior written consent of National Instruments Corporation Trademarks CVI IMAQ LabVIEW Measurement Studio National Instruments NI ni com and NI IMAQ are trademarks of National Instruments Corporation Product and company names mentioned herein are trademarks or trade names of their respective companies Patents For patents covering National Instruments products refer to the appropriate location Help Patents in your software the patents txt file on your CD or ni com patents WARNING REGARDING USE OF NATIONAL INSTRUMENTS PRODUCTS 1 NATIONAL INSTRUMENTS PRODUCTS ARE NOT DESIGNED WITH COMPONENTS AND TESTING FOR A LEVEL OF RELIABILITY SUITABLE FOR USE IN OR IN CONNECTION WITH SURGICAL IMPLANTS OR AS CRITICAL COMPONENTS IN ANY LIFE SUPPORT SYSTEMS WHOSE FAILURE TO PERFORM CAN REASONABLY BE EXPECTED TO CAUSE SIGNIFICANT INJURY TO A HUMAN 2 IN ANY APPLICATION INCLUDING THE ABOVE RELIABILITY OF OPERATION OF THE SOFTWARE PRODUCTS CAN BE IMPAIRED BY ADVERSE FACTORS INCLUDING BUT NOT LIMITED TO FLUCTUATIONS IN ELECTRICAL POWER SUPPLY COMPUTER HARDWARE MALFUNCTIONS COMPUTER OPERATING SYSTEM SOFTWARE FITNESS FITNESS OF COMPILERS AND DEVELOPMENT SOFTWARE USED TO DEVELOP AN APPLICATION INSTALLATI
histograms, line profiles, and intensity measurements.
Chapter 5, Image Processing, contains information about lookup tables, kernels, spatial filtering, and grayscale morphology.
Chapter 6, Operators, contains information about arithmetic and logic operators that mask, combine, and compare images.
Chapter 7, Frequency Domain Analysis, contains information about frequency domain analysis, the Fast Fourier transform, and analyzing and processing images in the frequency domain.
Image Analysis
This chapter contains information about histograms, line profiles, and intensity measurements. Image analysis combines techniques that compute statistics and measurements based on the gray-level intensities of the image pixels. You can use the image analysis functions to understand the content of the image and to decide which type of inspection tools to use to solve your application. Image analysis functions also provide measurements that you can use to perform basic inspection tasks, such as presence or absence verification.
Histogram
When to Use
A histogram counts and graphs the total number of pixels at each grayscale level. From the graph, you can tell whether the image contains distinct regions of a certain gray-level value. A histogram provides a general description of the appearance of an image and helps identify various components, such as the background, objec
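Counting pixels at each grayscale level is a one-line operation with NumPy. This illustrative sketch builds the 256-bin histogram of a small 8-bit image:

```python
import numpy as np

img = np.array([[0, 1, 1],
                [2, 2, 2],
                [5, 5, 0]], dtype=np.uint8)
# one bin per gray level: hist[g] = number of pixels with value g
hist = np.bincount(img.ravel(), minlength=256)
```

Plotting `hist` against the gray-level axis gives exactly the graph the text describes: peaks reveal distinct regions sharing a gray-level value.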
visual inspection tasks.

mask FFT filter: Removes frequencies contained in a mask range specified by the user.

match score: A number ranging from 0 to 1000 that indicates how closely an acquired image matches the template image. A match score of 1000 indicates a perfect match; a match score of 0 indicates no match.

median filter: A lowpass filter that assigns to each pixel the median value of its neighbors. This filter effectively removes isolated pixels without blurring the contours of objects.

memory buffer: See buffer.

mile: An Imperial unit of length equal to 5,280 feet or 1,609.344 meters. Also known as a statute mile, to distinguish it from a nautical mile. See also nautical mile.

MMX: Multimedia Extensions. Intel chip-based technology that allows parallel operations on integers, which results in accelerated processing of 8-bit images.

morphological transformations: Extract and alter the structure of objects in an image. You can use these transformations for expanding (dilating) or reducing (eroding) objects, filling holes, closing inclusions, or smoothing borders. They are used primarily to delineate objects and prepare them for quantitative inspection analysis.

MSB: Most significant bit.

M-skeleton function: Uses an M-shaped structuring element in the skeleton function.

nautical mile; neighbor; neighborhood operations; NI-IMAQ; nonlinear filter; nonlinear gradient filter
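The median filter defined above can be sketched in a few lines. This is an illustrative pure-Python version, not the IMAQ Vision implementation; the function name is hypothetical.

```python
# Sketch of a 3 x 3 median filter: each interior pixel is replaced by the
# median of its 3 x 3 neighborhood, which removes isolated (speckle) pixels
# without blurring the contours of objects. Border pixels are left unchanged.
from statistics import median

def median_filter_3x3(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighborhood = [image[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = median(neighborhood)
    return out

# An isolated bright pixel (a typical speckle) disappears:
img = [[0, 0, 0, 0],
       [0, 255, 0, 0],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
filtered = median_filter_3x3(img)
```

Because the median of the speckle's neighborhood is 0, the isolated 255 value is removed while every contour pixel of a larger object would be preserved.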
automotive assemblies. An example of a color identification task is to ensure that the color of the fabric in the interior of a car adheres to specifications.

Color Inspection
Color inspection detects simple flaws, such as missing or misplaced color components, defects on the surfaces of color objects, or printing errors on color labels. You can use color matching for these applications if known regions of interest predefine the object or areas to be inspected in the image. You can define these regions, or they can be the output of some other machine vision tool, such as pattern matching used to locate the components to be inspected. The layout of the fuses in junction boxes in automotive assemblies is easily defined by regions of interest. Color matching determines if all of the fuses are present and in the correct locations. Figure 14-6 shows an example of a fuse box inspection application in which the exact location of the fuses in the image can be specified by regions of interest. Color matching compares the color of the fuse in each region to the color that is expected to be in that region.

Figure 14-6. Fuse Box Inspection Using Color Matching
(1: Score 51; 2: Score 382; 3: Score 23; 4: Score 649; 5: Score 29; 6: Score 70; 7: Score 1000; 8: Score 667; 9: Score 990; 10: Score 8; 11: Inspection Regions)

Color matching can be
Pixel Frame Shape
A digital image is a 2D array of pixels arranged in a rectangular grid. Morphological transformations that extract and alter the structure of particles allow you to process pixels in either a square or hexagonal configuration. These pixel configurations introduce the concept of a pixel frame. Pixel frames can be either aligned (square) or shifted (hexagonal). The pixel frame parameter is important for functions that alter the value of pixels according to the intensity values of their neighbors. Your decision to use a square or hexagonal frame affects how IMAQ Vision analyzes the image when you process it with functions that use this frame concept. IMAQ Vision uses the square frame by default.

Note: Pixels in the image do not physically shift in a hexagonal pixel frame. Functions that allow you to set the pixel frame shape merely process the pixel values differently when you specify a hexagonal frame.

Figure 9-3 illustrates the difference between a square and hexagonal pixel frame when a 3 × 3 and a 5 × 5 structuring element are applied.

Figure 9-3. Square and Hexagonal Pixel Frames
(1: square 3 × 3; 2: hexagonal 3 × 3; 3: square 5 × 5; 4: hexagonal 5 × 5)

If a morphological function uses a 3 × 3 structuring element
masked by a coefficient of the structuring element equal to 1 are then referred to as Pi.

P0 = min(Pi)

Note: A gray-level erosion using a structuring element f × f with all its coefficients set to 1 is equivalent to an Nth-order filter with a filter size f × f and the value N equal to 0. Refer to the Nonlinear Filters section for more information.

Dilation Concept and Mathematics
Each pixel in an image becomes equal to the maximum value of its neighbors. For a given pixel P0, the structuring element is centered on P0. The pixels masked by a coefficient of the structuring element equal to 1 are then referred to as Pi.

P0 = max(Pi)

Note: A gray-level dilation using a structuring element f × f with all its coefficients set to 1 is equivalent to an Nth-order filter with a filter size f × f and the value N equal to f² − 1. Refer to the Nonlinear Filters section for more information.

Proper Opening Concept and Mathematics
If I is the source image, the proper opening function extracts the minimum value of each pixel between the source image I and its transformed image obtained after an opening, followed by a closing, followed by another opening:

proper opening(I) = min(I, OCO(I))
or
proper opening(I) = min(I, DEEDDE(I))

where I is the source image, E is an erosion, and D is a dilation.
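The erosion and dilation definitions above (P0 = min(Pi) and P0 = max(Pi) under an all-ones structuring element) can be illustrated directly. This is a hedged pure-Python sketch, not the IMAQ Vision implementation, and the helper name is hypothetical.

```python
# Gray-level erosion and dilation with a 3 x 3 structuring element whose
# coefficients are all 1: each interior pixel becomes the minimum (erosion)
# or maximum (dilation) of its 3 x 3 neighborhood.
def gray_morph(image, op):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = [image[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = op(neigh)   # min -> erosion, max -> dilation
    return out

img = [[10, 10, 10, 10],
       [10, 50, 20, 10],
       [10, 20, 20, 10],
       [10, 10, 10, 10]]
eroded  = gray_morph(img, min)   # Nth-order filter with N = 0
dilated = gray_morph(img, max)   # Nth-order filter with N = f*f - 1
```

The bright peak (50) is flattened by the erosion and spread by the dilation, matching the Nth-order filter equivalences stated in the notes above.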
checking that all the pills have the same color information. Because your task is to determine if there are a fixed number of the correct pills in the pack, color location is a very effective tool.

Figure 14-8a shows the template image of the part of the pill that contains the color information that you want to locate. Figure 14-8b shows the pills located in a good blister pack. Figure 14-8c shows the pills located when a blister pack contains the wrong type of pills or missing pills. Because the exact locations of the pills are not necessary for the inspection, the number of matches returned by color location indicates whether a blister pack passes inspection.

Figure 14-8. Blister Pack Inspection Using Color Matching

Identification
Identification assigns a label to an object based on its features. In many applications, color-coded identification marks are placed on the objects. In these applications, color matching locates the color code and identifies the object. In a spring identification application, different types of springs are identified by a collection of color marks painted on the coil. If you know the different types of color patches that are used to mark the springs, color location can find which color marks appear in the image. You then can use this information to identify the type of spring.

Sorting
Sorting separates objects based on attributes such as color, size, and
looks like the source image with all significant variations of the light intensity highlighted.

Source Image → Laplacian #4 → Filtered Image

−1 −1 −1
−1  9 −1
−1 −1 −1

Notice that the Laplacian #4 kernel can be decomposed as follows:

−1 −1 −1     −1 −1 −1     0 0 0
−1  9 −1  =  −1  8 −1  +  0 1 0
−1 −1 −1     −1 −1 −1     0 0 0

Note: The convolution filter using the second kernel on the right side of the equation reproduces the source image. All neighboring pixels are multiplied by 0, and the central pixel remains equal to itself: P = 1 × P.

This equation indicates that the Laplacian #4 kernel adds the contours extracted by the Laplacian #3 kernel to the source image:

Laplacian #4 = Laplacian #3 + Source Image

For example, if the central coefficient of the Laplacian #4 kernel is 10, the Laplacian filter adds the contours extracted by the Laplacian #3 kernel to the source image times 2, and so forth. A greater central coefficient corresponds to less prominent contours and details highlighted by the filter.

Contour Thickness
Larger kernels correspond to thicker contours. The following image is a Laplacian 3 × 3. The following image is a Laplacian 5 × 5.

Smoothing Filter
A smoothing filter attenuates the variations of light intensity in the neighborhood of a pixel. It smooths the overall shape of objects, blurs edges, and removes details.
external and internal boundaries of particles, and locating particular configurations of pixels. You can also use these transformations to prepare particles for quantitative analysis, to observe the geometry of regions, and to extract the simplest forms for modeling and identification purposes.

Primary Morphology Concepts
The primary morphology functions apply to binary images in which particles have been set to 1 and the background is equal to 0. They include three fundamental binary processing functions: erosion, dilation, and hit-miss. The other transformations are combinations of these three functions. This section describes the following primary morphology transformations:
• Erosion
• Dilation
• Opening
• Closing
• Inner gradient
• Outer gradient
• Hit-miss
• Thinning
• Thickening
• Proper opening
• Proper closing
• Auto-median

Note: In the following descriptions, the term pixel denotes a pixel equal to 1, and the term particle denotes a group of pixels equal to 1.

Erosion and Dilation Functions
An erosion eliminates pixels isolated in the background and erodes the contour of particles according to the template defined by the structuring element. For a given pixel P0, the structuring element is centered on P0. The pixels masked by a coefficient
classes (table), 5-15; definition, 5-14; nonlinear: basic concepts, 5-29; mathematical concepts, 5-34
lowpass frequency (FFT) filters: attenuation, 7-6; examples, 7-7; overview, 7-2; truncation, 7-6
L-skeleton function, 9-25
LUTs. See lookup tables.
manual. See documentation.
mapping methods for 16-bit image display, 2-2
mask (FFT) filters: overview, 7-2; purpose and use, 7-10
Mask operator (table), 6-3
masks. See image masks; structuring elements.
Max Feret Diameter, digital particles, 10-4
max horizontal segment length, digital particles, 10-8
Max operator (table), 6-3
Mean operator (table), 6-3
median filter: basic concepts, 5-30; mathematical concepts, 5-34
meter functions: algorithm limits, 15-2; purpose and use, 15-1
metric technique in automatic thresholding: in-depth discussion, 8-8; overview, 8-5
Min operator (table), 6-3
Modulo operator (table), 6-2
moments technique in automatic thresholding: in-depth discussion, 8-9; overview, 8-5
moments, digital particles, 10-18
morphology functions. See binary morphology; grayscale morphology functions.
M-skeleton function, 9-26
multiple instances of patterns. See pattern orientation and multiple instances.
Multiply operator (table), 6-2
NAND operator (table), 6-3
National Instruments: customer education, B-1; professional services, B-1; system integration services, B-1; technical support, B-1; worldwide offices, B-1
National Instruments internal image file
element with the highest value in the color spectrum. You also can use the array of the color spectrum to directly analyze the color distribution and for color matching applications.

Figure 14-4. Color Spectrum Associated with an Image

Color Matching
Color matching quantifies which colors and how much of each color exist in a region of an image and uses this information to check whether another image contains the same colors in the same ratio. Use color matching to compare the color content of an image, or regions within an image, to reference color information. With color matching, you create an image or select regions in an image that contain the color information you want to use as a reference. The color information in the image may consist of one or more colors. The machine vision software then learns the three-dimensional color information in the image and represents it as a one-dimensional color spectrum. Your machine vision application compares the color information in the entire image or regions in the image to the learned color spectrum, calculating a score for each region. The score relates how closely the color information in the image region matches the information represented by the color spectrum.

When to Use
smaller than the error value 95% of the time. A pixel coordinate with a small error value indicates that its estimated real-world coordinate is computed very accurately. A large error value indicates that the estimated real-world coordinate for a pixel may not be accurate. Use the error map to determine whether your imaging setup and calibration information satisfy the accuracy requirements of your inspection application. If the error values are greater than the positional errors that your application can tolerate, you need to improve your imaging setup. An imaging system with high lens distortion usually results in an error map with high values. If you are using a lens with considerable distortion, you can use the error map to determine the position of the pixels that satisfy your application's accuracy requirements. Because the effect of lens distortion increases toward the image borders, pixels close to the center of the image have lower error values than the pixels at the image borders.

Image Correction
Image correction involves transforming a distorted image acquired in a calibrated setup into an image where perspective errors and lens distortion are corrected. IMAQ Vision corrects an image by applying the transformation from pixel to real-world coordinates for each pixel in the input image. Then IMAQ Vision applies simple shift and scaling transformations to position the real-world coordinates into a new image. IMAQ Vision uses interpolation
Pixels in the image mask determine whether corresponding pixels in the inspection image are processed. If a pixel in the image mask has a nonzero value, the corresponding pixel in the inspection image is processed. If a pixel in the image mask has a value of 0, the corresponding pixel in the inspection image is not processed. Use image masks when you want to focus your processing or inspection on particular regions in the image. Pixels in the source image are processed if corresponding pixels in the image mask have values other than zero.

Figure 1-4 shows how a mask affects the output of the function that inverts the pixel values in an image. Figure 1-4a shows the inspection image. Figure 1-4b shows the image mask; pixels in the mask with zero values are represented in black, and pixels with nonzero values are represented in white. Figure 1-4c shows the inverse of the inspection image using the image mask. Figure 1-4d shows the inverse of the inspection image without the image mask.

Figure 1-4. The Effect of an Image Mask

You can limit the area in which your function applies an image mask to the bounding rectangle of the region you want to process. This technique saves memory by limiting the image mask to only the part of the image containing significant information. To keep track of the location of this region of interest (ROI) in regard to the original image, IMAQ Vision sets an offset
Figure 14-11 illustrates the color location process:

Learning Phase. Template: learn the color information in the template, producing a template descriptor.
Matching Phase. Image: use a coarse-to-fine search strategy to find a list of possible matches with scores (the initial match list), then refine each match location using a hill-climbing process and update the scores.

Figure 14-11. Overview of the Color Location Process

Color Pattern Matching
When to Use
Use color pattern matching to quickly locate known reference patterns, or fiducials, in a color image. With color pattern matching, you create a model, or template, that represents the object you are searching for. Then your machine vision application searches for the model in each acquired image, calculating a score for each match. The score indicates how closely the model matches the color pattern found. Use color pattern matching to locate reference patterns that are fully described by the color and spatial information in the pattern.

Grayscale, or monochrome, pattern matching is a well-established tool for alignment, gauging, and inspection applications. Refer to Chapter 12, Pattern Matching, for more information about pattern matching. In all of these application areas, color simplifies a monochrome problem by improving contrast or separation of the object
system. Figure 13-4b shows an inspection image with an updated coordinate system.

Figure 13-4. Locating a Coordinate System with Shift-Invariant Pattern Matching
(1: located feature; 2: coordinate system; 3: measurement area)

Figure 13-5 illustrates how to locate a coordinate system using a rotation-invariant pattern matching strategy. Figure 13-5a shows a reference image with a defined reference coordinate system. Figure 13-5b shows an inspection image with an updated coordinate system.

Figure 13-5. Locating a Coordinate System with Rotation-Invariant Pattern Matching
(1: located feature; 2: coordinate system; 3: origin of the coordinate system; 4: measurement areas)

Finding Features or Measurement Points
Before making measurements, you must locate features that you can use to make the measurements. There are many ways to find these features on an image. The most common features used to make measurements are points along the boundary of the part you want to gauge.

Edge-Based Features
Use the edge detection techniques described in Chapter 11, Edge Detection, to find edge points along a single search contour or along multiple search contours defined inside a 2D search area.

Line and Circular Features
Use the line detection functions in IMAQ Vision
Concepts
To calibrate an imaging setup, the calibration software uses a set of known mappings between points in the image and their corresponding locations in the real world. The calibration software uses these known mappings to compute the pixel-to-real-world mapping for the entire image. The resulting calibration information is valid only for the imaging setup that you used to create the mapping. Any change in the imaging setup that violates the mapping information compromises the accuracy of the calibration information.

Calibration Process
The calibration software requires a list of known pixel-to-real-world mappings to compute calibration information for the entire image. You can specify the list in two ways:
• Image a grid of dots similar to the one shown in Figure 3-5a, and input the dx and dy spacing between the dots in real-world units. The calibration software uses the image of the grid, shown in Figure 3-5b, and the spacing between the dots in the grid to generate the list of pixel-to-real-world mappings required for the calibration process.
• Input a list of real-world points and the corresponding pixel coordinates directly to the calibration software.

Figure 3-5. Calibration Setup

The calibration process uses the list of pixel-to-real-world mappings and a user-defined algorithm to create a mapping for the entire image.
component from the color information, and the close relationship between chromaticity and human perception of color, make the HSL space ideal for developing machine vision applications.

CIE XYZ Color Space
The CIE color space system classifies colors according to the human vision system. This system specifies colors in CIE coordinates and is a standard for comparing one color in the CIE coordinates with another. Visible light is electromagnetic energy that occupies approximately the 400 nm to 700 nm wavelength part of the spectrum. Humans perceive these wavelengths as the colors violet through indigo, blue, green, yellow, orange, and red. Figure 1-8 shows the amounts of red, green, and blue light needed by an average observer to match a color of constant luminance for all values of dominant wavelengths in the visible spectrum. The dominant wavelength is the wavelength of the color humans see when viewing the light. The negative values between 438.1 nm and 546.1 nm indicate that all visible colors cannot be specified by adding together the three positive primaries R, G, and B in the RGB color space.

Figure 1-8. Visible Light (tristimulus values versus wavelength, 400 nm to 700 nm)

In 1931, the CIE developed a system
between 0 and 360, and when their appearance is degraded. Refer to Chapter 12, Pattern Matching, for more information on grayscale pattern matching.

Combining Color Location and Grayscale Pattern Matching
Color pattern matching uses a combination of color location and grayscale pattern matching to search for the template. When you use color pattern matching to search for a template, the software uses the color information in the template to look for occurrences of the template in the image. The software then applies grayscale pattern matching in a region around each of these occurrences to find the exact position of the template in the image. Figure 14-16 shows the general flow of the color pattern matching algorithm. The size of the searchable region is determined by the software based on the inputs you provide, such as search strategy and color sensitivity.

Figure 14-16. Flow of the Color Pattern Matching Algorithm:
Learning Phase. Template: learn the color information and the information for grayscale pattern matching, producing a template descriptor.
Matching Phase. Image: use the first part of the color location algorithm to find instances of the template in the image (match locations based on color); search a region around each color match using grayscale pattern matching to obtain final locations; score each match according to color and grayscale information.
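The coarse-then-fine idea described above can be sketched with two toy scoring functions: a coarse color-distribution score selects candidates, and a fine grayscale comparison refines the final score. The helper names, histogram binning, and scoring formulas below are illustrative assumptions only, not the actual IMAQ Vision algorithm.

```python
# Two-stage scoring sketch: coarse color score, then fine grayscale score.
# Scores are mapped to the manual's 0..1000 match-score convention.
def color_score(region, template):
    """Coarse score: compare normalized intensity histograms (0..1000)."""
    def hist(pixels, bins=4):
        h = [0] * bins
        for p in pixels:
            h[min(p * bins // 256, bins - 1)] += 1
        total = sum(h)
        return [c / total for c in h]
    ha, hb = hist(region), hist(template)
    diff = sum(abs(a - b) for a, b in zip(ha, hb)) / 2   # 0 = identical
    return round(1000 * (1 - diff))

def gray_score(region, template):
    """Fine score: mean absolute pixel difference mapped to 0..1000."""
    mad = sum(abs(a - b) for a, b in zip(region, template)) / len(region)
    return round(1000 * (1 - mad / 255))

template = [30, 30, 200, 200]
candidate = [32, 28, 198, 204]
# Final score combines both stages (here: the weaker of the two).
final = min(color_score(candidate, template), gray_score(candidate, template))
```

A candidate that passes the coarse color stage with a perfect score can still be downgraded by the fine grayscale stage, which is the point of the two-stage design.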
can send email to techpubs@ni.com.

© 2000-2003 National Instruments Corporation. All rights reserved.

Important Information
Warranty
The media on which you receive National Instruments software are warranted not to fail to execute programming instructions, due to defects in materials and workmanship, for a period of 90 days from date of shipment, as evidenced by receipts or other documentation. National Instruments will, at its option, repair or replace software media that do not execute programming instructions if National Instruments receives notice of such defects during the warranty period. National Instruments does not warrant that the operation of the software shall be uninterrupted or error free. A Return Material Authorization (RMA) number must be obtained from the factory and clearly marked on the outside of the package before any equipment will be accepted for warranty work. National Instruments will pay the shipping costs of returning to the owner parts which are covered by warranty. National Instruments believes that the information in this document is accurate. The document has been carefully reviewed for technical accuracy. In the event that technical or typographical errors exist, National Instruments reserves the right to make changes to subsequent editions of this document without prior notice to holders of this edition. The reader should consult National Instruments if errors are suspected. In no event shall National Instruments
of pixels surrounding an image. Functions that process pixels based on the value of the pixel neighbors require image borders.

image browser: An image that contains thumbnails of images to analyze or process in a vision application.

image buffer: Memory location used to store images.

image definition: See pixel depth.

image enhancement: The process of improving the quality of an image that you acquire from a sensor, in terms of signal-to-noise ratio, image contrast, edge definition, and so on.

image file: A file containing pixel data and additional information about the image.

image format: Defines how an image is stored in a file. Usually composed of a header followed by the pixel data.

image mask: A binary image that isolates parts of a source image for further processing. A pixel in the source image is processed if its corresponding mask pixel has a nonzero value. A source pixel whose corresponding mask pixel has a value of 0 is left unchanged.

image palette: The gradation of colors used to display an image on screen, usually defined by a color lookup table.

image processing: Encompasses various processes and analysis functions that you can apply to an image.

image source: Original input image.

image understanding: A technique that interprets the content of the image at a symbolic level.

image visualization; imaging; IMAQ; in.; INL; inner gradient; inspection
(2: search lines; 4: secondary axis)

Figure 13-2. Locating a Coordinate System with One Search Area

Two Search Areas
This method uses the same operating mode as the single search area method. However, the two edges used to define the coordinate system axes are located in two distinct search areas. The function first determines the position of the main axis of the coordinate system. It locates the intersection points between a set of parallel search lines in the primary search area and a distinct straight edge of the object. The intersection points are determined based on their contrast, width, and steepness. For more information about detecting edges, refer to Chapter 11, Edge Detection. A line fitted through the intersection points defines the primary axis. The process is repeated perpendicularly in the secondary search area to locate the secondary axis. The intersection between the primary axis and the secondary axis is the origin of the coordinate system. Figure 13-3a shows a reference image with a defined reference coordinate system. Figure 13-3b shows an inspection image with an updated coordinate system.

Figure 13-3. Locating a Coordinate System with Two Search Areas
(1: primary search area; 2: secondary search area; 3: origin of the coordinate system; 4: measurement area)

Pattern Matching Based Coordinate System
and 255 are special cases: a value of 0 results in black, and a value of 255 results in white. This periodic palette is appropriate for the display of binary and labeled images.

Regions of Interest
When to Use
A region of interest (ROI) is an area of an image in which you want to perform your image analysis. Use ROIs to focus your processing and analysis on part of an image. You can define an ROI using standard contours, such as an oval or rectangle, or freehand contours. You also can perform any of the following options:
• Construct an ROI in an image display environment
• Associate an ROI with an image display environment
• Extract an ROI associated with an image display environment
• Erase the current ROI from an image display environment
• Transform an ROI into an image mask
• Transform an image mask into an ROI

ROI Concepts
An ROI describes a region or multiple regions of an image in which you want to focus your processing and analysis. These regions are defined by specific contours. IMAQ Vision supports the following contour types:

Table 2-2. Types of Contours an ROI May Contain
Point, Line, Rectangle, Rotated Rectangle, Oval, Annulus, Broken Line, Polygon, Freehand Line, Freehand Region
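Transforming an ROI into an image mask, as listed above, can be sketched for the simplest case of a rectangular contour. This pure-Python helper is hypothetical and illustrative only; the IMAQ Vision functions work differently.

```python
# Builds an image mask from a rectangular ROI: pixels inside the ROI get a
# nonzero value (255) and are processed; all other pixels get 0 and are not.
def rect_roi_to_mask(width, height, left, top, right, bottom):
    return [[255 if (left <= x < right and top <= y < bottom) else 0
             for x in range(width)]
            for y in range(height)]

mask = rect_roi_to_mask(width=6, height=4, left=1, top=1, right=4, bottom=3)
```

A processing function would then consult this mask per pixel, exactly as the image mask description in this chapter specifies.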
conditions, 12-4; pattern orientation and multiple instances, 12-3; when to use, 12-1
pattern orientation and multiple instances: color location tool, 14-13; color pattern matching, 14-19; pattern matching, 12-3
percent measurements, digital particles, 10-16
perimeter measurements, digital particles, 10-9
periodic palette (figure), 2-8
perspective: camera angle relative to object (figure), 3-6; perspective and distortion errors (figure), 3-6
perspective algorithm for calibration, 3-11
phone technical support, B-1
picture element, 1-1
pixel depth, 1-2
pixel frame shape: examples (figures), 9-4; hexagonal frame, 9-6; overview, 9-4; square frame, 9-6
pixel resolution: definition, 3-3; determining, 3-3; relationship with field of view, 3-4
pixels: gray-level values, 1-1; neighbors, 1-8; number of pixels in sensor, 3-5; spatial coordinates, 1-1; values for image border, 1-8
planes, number in image, 1-3
PNG (portable network graphics) file format, 1-5
predefined kernels: Gaussian kernels, A-7; gradient kernels: Prewitt filters, A-1; Sobel filters, A-2; Laplacian kernels, A-5; smoothing kernels, A-6
predefined lookup tables, 5-3
Prewitt filter: basic concepts, 5-27; examples, 5-28; mathematical concepts, 5-33; predefined kernels, A-1
primary binary morphology functions. See binary morphology.
professional services, B-1
programming examples, B-1
proper closing function: binary morphology, 9-20; grayscale morphology concepts
defined by their hue, saturation, and luminance values.

Image Types
The IMAQ Vision libraries can manipulate three types of images: grayscale, color, and complex images. Although IMAQ Vision supports all three image types, certain operations on specific image types are not possible, for example, applying the logic operator AND to a complex image. Table 1-1 shows how many bytes per pixel grayscale, color, and complex images use. For an identical spatial resolution, a color image occupies four times the memory space of an 8-bit grayscale image, and a complex image occupies eight times the memory of an 8-bit grayscale image.

Table 1-1. Bytes per Pixel
8-bit Unsigned Integer Grayscale: 1 byte (8 bits) for the grayscale intensity.
16-bit Signed Integer Grayscale: 2 bytes (16 bits) for the grayscale intensity.
32-bit Floating-Point Grayscale: 4 bytes (32 bits) for the grayscale intensity.
RGB Color: 4 bytes (32 bits); 8 bits for the alpha value (not used), 8 bits for the red intensity, 8 bits for the green intensity, 8 bits for the blue intensity.
HSL Color: 4 bytes (32 bits); 8 bits not used, 8 bits for the hue, 8 bits for the saturation, 8 bits for the luminance.
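The 4-byte color pixel layout in Table 1-1 (one unused alpha byte plus 8 bits each for red, green, and blue) can be illustrated by packing and unpacking a 32-bit value. The alpha-in-the-high-byte ordering used here is an assumption for the sketch, not a statement about how IMAQ Vision stores pixels internally.

```python
# Packs three 8-bit color components into one 32-bit pixel value and
# unpacks them again, mirroring the 4-bytes-per-pixel layout of Table 1-1.
def pack_rgb(red, green, blue, alpha=0):
    return (alpha << 24) | (red << 16) | (green << 8) | blue

def unpack_rgb(value):
    return ((value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF)

pixel = pack_rgb(200, 100, 50)
```

The round trip recovers the original components, and the packed value occupies exactly the 4 bytes per pixel that the table specifies for color images.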
kernels. The notations west (W), south (S), east (E), and north (N) indicate which edges of bright regions they outline.

Table A-2. Sobel Filters

#16 W Edge      #17 W Image     #18 SW Edge     #19 SW Image
 1  0 −1         1  0 −1         0 −1 −2         0 −1 −2
 2  0 −2         2  1 −2         1  0 −1         1  1 −1
 1  0 −1         1  0 −1         2  1  0         2  1  0

#20 S Edge      #21 S Image     #22 SE Edge     #23 SE Image
−1 −2 −1        −1 −2 −1        −2 −1  0        −2 −1  0
 0  0  0         0  1  0        −1  0  1        −1  1  1
 1  2  1         1  2  1         0  1  2         0  1  2

#24 E Edge      #25 E Image     #26 NE Edge     #27 NE Image
−1  0  1        −1  0  1         0  1  2         0  1  2
−2  0  2        −2  1  2        −1  0  1        −1  1  1
−1  0  1        −1  0  1        −2 −1  0        −2 −1  0

#28 N Edge      #29 N Image     #30 NW Edge     #31 NW Image
 1  2  1         1  2  1         2  1  0         2  1  0
 0  0  0         0  1  0         1  0 −1         1  1 −1
−1 −2 −1        −1 −2 −1         0 −1 −2         0 −1 −2

5 × 5 Kernels
The following table lists the predefined gradient 5 × 5 kernels.

Table A-3. Gradient 5 × 5
(#0 W Edge, #1 W Image, #2 SW Edge, #3 SW Image, #4 S Edge, #5 S Image, #6 SE Edge, #7 SE Image, #8 E Edge, #9 E Image, #10 NE Edge, #11 NE Image)
determines the location, orientation, and scale of a part being inspected.

alpha channel: Channel used to code extra information, such as gamma correction, about a color image. The alpha channel is stored as the first byte in the four-byte representation of an RGB pixel.

area of interest: A rectangular portion of an acquisition window or frame that is controlled and defined by software.

area threshold: Detects objects based on their size, which can fall within a user-specified range.

arithmetic operators: The image operations multiply, divide, add, subtract, and remainder.

array: Ordered, indexed set of data elements of the same type.

auto-median function: A function that uses dual combinations of opening and closing operations to smooth the boundaries of objects.

b: Bit. One binary digit, either 0 or 1.

B: Byte. Eight related bits of data, an 8-bit binary number. Also denotes the amount of memory required to store 1 byte of data.

barycenter: The grayscale value representing the centroid of the range of an image's grayscale values in the image histogram.

binary image: An image in which the objects usually have a pixel intensity of 1 (or 255) and the background has a pixel intensity of 0.

binary morphology: Functions that perform morphological operations on a binary image.

binary threshold: Separation of an image into objects of

bit depth; black reference level; blurring; BMP; border function; brightness; buffer
ng searches for the presence of a shape in a binary image and specifies the location of each matching shape. IMAQ Vision detects the shape even if it is rotated or scaled. Binary shape matching is performed by extracting, from a template object, parameters that represent the shape of the object and are invariant to the rotation and scale of the shape. These parameters are then compared with a similar set of parameters extracted from other objects.

Binary shape matching has the benefit of finding features regardless of size and orientation. If you are searching for a feature that has a known shape but unknown size and orientation, and the image can be thresholded, consider using binary shape matching. Binary shape matching is useful in robot guidance applications, such as a robot arm sorting parts into groups of similar shapes.

Shape Matching Concepts

A shape matching function takes the following as inputs:

• An image of the shape (template) that you are looking for
• A binary image containing the parts that you want to sort
• A tolerance level that indicates the permitted degree of mismatch between the template and the parts

The output is an image containing only the matched parts and a report detailing the location of each part, the centroid of the part, and a score that indicates the degree of match between the template and the part.
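The idea of comparing rotation- and scale-invariant shape parameters can be sketched in a few lines. IMAQ Vision's exact shape parameters are not documented here, so the first Hu moment (the normalized second-order central moments eta20 + eta02) stands in for them; `hu1`, `shape_match`, and the tolerance default are illustrative, not the IMAQ Vision API.

```python
# Sketch of binary shape matching via an invariant moment. A particle
# is a set of (x, y) pixel coordinates; hu1 is unchanged (up to
# discretization) by translation, rotation, and scale.

def hu1(points):
    """First Hu moment of a set of (x, y) pixel coordinates."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    mu20 = sum((x - cx) ** 2 for x, _ in points)
    mu02 = sum((y - cy) ** 2 for _, y in points)
    # Normalizing by area^2 makes the value scale invariant.
    return (mu20 + mu02) / (n ** 2)

def shape_match(template_pts, candidate_pts, tolerance=0.1):
    """True if the two shapes agree within the relative tolerance."""
    a, b = hu1(template_pts), hu1(candidate_pts)
    return abs(a - b) / max(a, b) <= tolerance

# A 3x3 square and the same square scaled by 2 match; a 1x9 bar does not.
square = [(x, y) for x in range(3) for y in range(3)]
big_square = [(x, y) for x in range(6) for y in range(6)]
bar = [(x, 0) for x in range(9)]
print(shape_match(square, big_square))  # True
print(shape_match(square, bar))         # False
```

A real implementation would combine several such invariants (as the Hu moments listed in Chapter 10 suggest) before scoring a match.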
LCD ........................................ 15-2
LCD Algorithm Limits ........................................ 15-2
Barcode ........................................ 15-3
Barcode Algorithm Limits ........................................ 15-3

Appendix A: Kernels
Appendix B: Technical Support and Professional Services
Glossary
Index

About This Manual

The IMAQ Vision Concepts Manual helps people with little or no imaging experience learn the basic concepts of machine vision and image processing. This manual also contains in-depth discussions of machine vision and image processing functions for advanced users.

Conventions

The following conventions appear in this manual:

»: The » symbol leads you through nested menu items and dialog box options to a final action. The sequence File»Page Setup»Options directs you to pull down the File menu, select the Page Setup item, and select Options from the last dialog box.

Tip icon: This icon denotes a tip, which alerts you to advisory information.

Note icon: This icon denotes a note, which alerts you to important information.

bold: Bold text denotes items that you must select or click in the software, such as menu items and dialog box options. Bold text also denotes parameter names.

italic: Italic text denotes variables, emphasis, a cross-reference, or an introduction to a key concept. This font also denotes text that is a placeholder for a word or value that you must supply.

monospace: Text in this font denotes text or characters that you should enter from the keyboard, sections of code, programming examples, and syntax examples. This font is also used for the proper names of disk drives, paths, directories, programs, subprograms, subroutines, device names, functions, operations, variables, filenames and extensions, and code excerpts.

Related Documentation

The following documents contain information that you might find helpful as you read this manual:

• IMAQ Vision for LabVIEW User Manual
• IMAQ Vision for Measurement Studio User Manual (LabWindows/CVI)
• IMAQ Vision for Measurement Studio User Manual (Visual Basic)
• IMAQ Vision for LabVIEW Help
• IMAQ Vision for LabWindows/CVI Function Reference
• IMAQ Vision for Visual Basic Reference
• NI Vision Assistant Tutorial
• NI Vision Assistant Help
• NI Vision Builder for Automated Inspection Tutorial
• NI Vision Builder for Automated Inspection Configuration Help

Part I: Vision Basics

This section describes conceptual information about digital images, image display, and system calibration. Part I, Vision Basics, contains the following chapters:

Chapter 1, Digital Images, contains information about the properties of digital images, image types, and file formats.
nspection problem by improving contrast or separating the object from the background.

Color location algorithms provide a quick way to locate regions in an image with specific colors. Use color location when your application has the following characteristics:

• Requires the location and the number of regions in an image with their specific color information
• Relies on the cumulative color information in the region, instead of how the colors are arranged in the region
• Does not require the orientation of the region
• Does not require the location with subpixel accuracy

The color location tools in IMAQ Vision measure the similarity between an idealized representation of a feature, called a model, and a feature that may be present in an image. A feature for color location is defined as a region in an image with specific colors.

Color location is useful in many applications. Color location provides your application with information about the number of instances and locations of the template within an image. Use color location in the following general applications: inspection, identification, and sorting.

Inspection

Inspection detects flaws such as missing components, incorrect printing, and incorrect fibers on textiles. A common pharmaceutical inspection application is inspecting a blister pack for the correct pills. Blister pack inspection involves checking that all the pills are of the correct type, which is easily performed by chec
You can define an ROI interactively, programmatically, or with an image mask. Define an ROI interactively by using the tools from the tools palette to draw an ROI on a displayed image. For more information about defining ROIs programmatically or with an image mask, refer to your IMAQ Vision user manual.

Nondestructive Overlay

When to Use

A nondestructive overlay enables you to annotate the display of an image with useful information without actually modifying the image. You can overlay text, lines, points, complex geometric shapes, and bitmaps on top of your image without changing the underlying pixel values in your image; only the display of the image is affected. Figure 2-1 shows how you can use the overlay to depict the orientation of each particle in the image.

Figure 2-1. Nondestructive Overlay (the overlay text reports the angle of each particle)

You can use nondestructive overlays for many purposes, such as the following:

• Highlighting the location in an image where objects have been detected
• Adding quantitative or qualitative information to the displayed image, like the match score from a pattern matching function
• Displaying ruler grids or alignment marks

Nondestructive Overlay Concepts

Overlays do not affect the results of any analysis or processing funct
ocations, providing more accurate results.

Figure 14-12. Comparison Between Color Location and Color Pattern Matching

Figure 14-13 shows the advantage of using color information when locating color-coded fuses on a fuse box. Figure 14-13a shows a grayscale image of the fuse box. In this image, the grayscale pattern matching tool has difficulty clearly differentiating between fuse 20 and fuse 25, and will return close match scores, because of the similar grayscale intensities and the translucent nature of the fuses. In the color image, Figure 14-13b, color helps to separate the fuses. The addition of color helps to improve the accuracy and reliability of the pattern matching tool.

Figure 14-13. Benefit of Adding Color to Fuse Box Inspection

The color pattern matching tools in IMAQ Vision measure the similarity between an idealized representation of a feature, called a model, and the feature that may be present in an image. A feature is defined as a specific pattern of color pixels in an image. Color pattern matching is the key to many applications. Color pattern matching provides your application with information about the number of instances and location of the template within an image. Use color pattern matching in the following three g
offer the possibility to display about 16.7 million colors. Understanding your display mode is important to understanding how IMAQ Vision displays the different image types on a screen.

Image processing functions often use grayscale images. Because display screen pixels are made of red, green, and blue components, the pixels of a grayscale image cannot be rendered directly. In 24-bit or 32-bit display mode, the display adapter uses 8 bits to encode a grayscale value, offering 256 gray shades. This color resolution is sufficient to display 8-bit grayscale images. However, higher bit-depth images, such as 16-bit grayscale images, are not accurately represented in 24-bit or 32-bit display mode. To display a 16-bit grayscale image, either ignore the least significant bits or use a mapping function to convert 16 bits to 8 bits.

Mapping Methods for 16-Bit Image Display

The following techniques describe how IMAQ Vision converts 16-bit images to 8-bit images and displays them using mapping functions. Mapping functions evenly distribute the dynamic range of the 16-bit image to an 8-bit image.

• Full Dynamic: The minimum intensity value of the 16-bit image is mapped to 0, and the maximum intensity value is mapped to 255. All other values in the image are mapped to lie between 0 and 255 using the equation output = (input - min) x 255 / (max - min). This mapping method is general-purpose because it ensures the display of the complete dynamic range
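The Full Dynamic mapping above can be sketched directly from its equation. This is a plain illustration of the formula, not the IMAQ Vision implementation, and the function name is made up; pixels are kept as a flat list to keep the sketch short.

```python
# Full Dynamic mapping: min -> 0, max -> 255, everything else linear.

def full_dynamic_16_to_8(pixels):
    """Linearly map 16-bit pixel values onto the 8-bit range."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                      # flat image: avoid division by zero
        return [0 for _ in pixels]
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

image16 = [0, 1024, 32768, 65535]
print(full_dynamic_16_to_8(image16))  # [0, 4, 128, 255]
```

Because the endpoints are taken from the image itself, the full dynamic range of each particular image is spread across all 256 displayable gray shades.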
ology 9-29
closing function: binary morphology (basic concepts 9-13; examples 9-13); grayscale morphology (description 5-38; examples 5-38)
clustering technique in automatic thresholding: examples 8-4; in-depth discussion 8-7; overview 8-3
CMY color space: description 1-19; transforming RGB to CMY 1-25
color distribution: comparing 14-9; learning 14-9
color identification: example 14-7; purpose and use 14-6, 14-11; sorting objects 14-11
color images: encoding 1-5; histogram of color image 4-5
color inspection 14-7
color location: basic concepts 14-14; identification of objects 14-11; inspection 14-10; overview 14-10; sorting objects 14-11; what to expect (ambient lighting conditions 14-13; blur and noise conditions 14-14; pattern orientation and multiple instances 14-13); when to use 14-10
color matching: basic concepts 14-8; comparing color distributions 14-9; learning color distribution 14-9; overview 14-5; when to use (color identification 14-6; color inspection 14-7)
color pattern matching: basic concepts (color matching and color location 14-21; combining color location and grayscale pattern matching 14-22; grayscale pattern matching 14-22); what to expect (ambient lighting conditions 14-20; blur and noise conditions 14-21; pattern orientation and multiple instances 14-19); when to use (alignment 14-19; gauging 14-18; inspection 14-19)
color spaces
olor 1-5; definition 1-1; internal representation of IMAQ Vision image 1-6; number of planes 1-3; pixel depth 1-2
inner gradient function, binary morphology 9-14
inspection: color inspection 14-7; color location 14-10; color pattern matching 14-18; pattern matching 12-1
instrument drivers B-1
instrument readers: barcode 15-3; LCD functions 15-2; meter functions 15-1; when to use 15-1
intensity measurements: densitometry parameters 4-6; when to use 4-6
intensity threshold 8-2
interclass variance technique in automatic thresholding 8-8
internal edge function, binary morphology 9-14
internal representation of IMAQ Vision image 1-6
interpretation of histogram 4-4
inverse gamma correction: See logarithmic and inverse gamma correction

J
JPEG (Joint Photographic Experts Group) format 1-6

K
kernel definition: Gaussian filters 5-26; gradient filters 5-16; Laplacian filters 5-21; smoothing filters 5-24
KnowledgeBase B-1

L
labeling function, binary morphology 9-23
Laplacian filters: contour extraction and highlighting 5-21; contour thickness 5-23; example 5-20; kernel definition 5-21; predefined kernels A-5
LCD functions: algorithm limits 15-2; purpose and use 15-2
length measurements, digital particles 10-9 to 10-12
lens, focal length, setting 3-5
lighting conditions in pattern matching 12-4
line detection functions in dimensional measurements 13-9
line fitting funct
Chapter 10. Particle Measurements

Figure 10-1. Pixel Coordinates to Real-World Coordinates (1. Point to Polygon; 2. Pixel Coordinates to Real-World Coordinates)

A pixel at (3, 8) is a square with corners at (2.5, 7.5), (3.5, 7.5), (3.5, 8.5), and (2.5, 8.5). To make real-world measurements, the four corner coordinates are transformed from pixel coordinates into real-world coordinates. Using real-world coordinates, the area and moments of the pixel can be integrated. Similarly, the area and moments of an entire particle can be computed using the calibrated particle contour points.

Particle Measurements

This section contains tables that list and describe the IMAQ Vision particle measurements. Table 10-1 lists concepts common to many of the particle measurements.

Particle Concepts

Table 10-1 contains concepts relating to particle measurements.

Table 10-1. Particle Concepts

• Bounding Rect: Smallest rectangle with sides parallel to the x-axis and y-axis that completely encloses the particle.
• Perimeter: Length of a boundary of a region. Because the boundary of a binary image is comprised of discrete pixels, IMAQ Vi
opening and closing: examples 5-38; overview 5-37
proper closing: concepts and mathematics 5-41; overview 5-39
proper opening: concepts and mathematics 5-40; overview 5-39
when to use 5-35
grayscale pattern matching: combining color location and grayscale pattern matching 14-22; methods 14-22

H
help: professional services B-1; technical support B-1
hexagonal pixel frame 9-6
highpass filters: binary morphology (basic concepts 9-23; effects table 9-23; example 9-24); classes table 5-15; definition 5-14
highpass frequency FFT filters: attenuation 7-8; examples 7-9; overview 7-2; truncation 7-9
histogram: basic concepts 4-2; color image histogram 4-5; cumulative histogram 4-3; definition 4-1; interpretation 4-4; linear histogram 4-3; scale of histogram 4-4; when to use 4-1
hit-miss function, binary morphology: basic concepts 9-14; examples 9-15; strategies for using (table) 9-16
hole filling function, binary morphology 9-22
hole measurement, digital particles: holes area 10-5; holes perimeter 10-9
HSL color space: basic concepts 1-17; generating color spectrum 14-1; mapping RGB to HSL color space 1-20
Hu moments 10-5
hue, definition 1-17
hydraulic radius parameter 10-13

I
image analysis: histogram (basic concepts 4-2; color image histogram 4-5; cumulative histogram 4-3; interpretation 4-4; linear histogram 4-3; scale of histogram 4-4; when to use 4-1); intensity measurements 4-6
or matching can be used for applications such as color identification, color inspection, color object location, and other applications that require the comparison of color information to make decisions.

Color Identification

Color identification identifies an object by comparing the color information in the image of the object to a database of reference colors that correspond to pre-defined object types. The object is assigned a label corresponding to the object type with the closest reference color in the database. Use color matching to first learn the color information of all the pre-defined object types. The color spectrums associated with each of the pre-defined object types become the reference colors. Your machine vision application then uses color matching to compare the color information in the image of the object to the reference color spectrums. The object receives the label of the color spectrum with the highest match score.

Figure 14-5 shows an example of a tile identification application. Figure 14-5a shows the image of a tile that needs to be identified. Figure 14-5b shows the scores obtained using color matching with a set of the reference tiles.

Figure 14-5. Color Matching (reference tile scores: 1: 592; 2: 6; 3: 31; 4: 338; 5: 1000; 6: 405)

Use color matching to verify the presence of correct components in automot
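The identification scheme above amounts to a nearest-spectrum classifier. IMAQ Vision's color spectrum is a specific HSL-based histogram whose exact layout is not given here, so a plain normalized histogram stands in for it; the 0-1000 score scaling mirrors the tile example, and all function names are illustrative, not the IMAQ Vision API.

```python
# Sketch of color identification: learn reference spectra, then label
# a sample with the reference that yields the highest match score.

def spectrum(values, bins=8, top=256):
    """Normalized histogram standing in for a color spectrum."""
    hist = [0] * bins
    for v in values:
        hist[v * bins // top] += 1
    return [h / len(values) for h in hist]

def match_score(a, b):
    """1000 for identical spectra, 0 for completely disjoint ones."""
    return round(1000 * (1 - 0.5 * sum(abs(x - y) for x, y in zip(a, b))))

def identify(sample, references):
    """Return the label of the best-matching reference spectrum."""
    return max(references, key=lambda name: match_score(sample, references[name]))

refs = {
    "dark":  spectrum([10, 20, 30, 40]),
    "light": spectrum([200, 210, 220, 250]),
}
print(identify(spectrum([15, 25, 35, 45]), refs))  # dark
```

A real application would learn one reference spectrum per object type during a training step, exactly as the tile example describes.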
Chapter 11. Edge Detection

point, which you define by setting the steepness parameter. This number corresponds to the expected transition region in the edge profile. Define the number of pixels averaged on each side by setting the width parameter. After computing the average, the software computes the difference between these averages to determine the contrast. Filtering reduces the effects of noise along the profile; if you expect the image to contain a lot of noise, use a large filter width. Figure 11-9 shows the relationship between the parameters and the edge profile.

To find the edge, the software scans across the one-dimensional grayscale profile pixel by pixel. At each point, the edge strength (contrast) is computed. If the contrast at the current point is greater than the user-set value for the minimum contrast for an edge, the point is stored for further analysis. Starting from this point, successive points are analyzed until the contrast reaches a maximum value and then falls below that value. The point where the contrast reaches the maximum value is tagged as the start edge location. The value of the steepness parameter is added to the start edge location to obtain the end edge location. The first point between the start edge location and end edge location where the difference between the point intensity value and the start edge value is greater than or equal to half the difference between the start
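The scanning procedure can be sketched as follows. This is a simplified illustration of averaging `width` pixels on either side of a `steepness`-pixel transition region to obtain a contrast value, not the exact IMAQ Vision algorithm; the function name and default parameter values are made up.

```python
# Scan a 1D grayscale profile and report the strongest edge whose
# contrast exceeds a minimum threshold.

def find_edge(profile, width=2, steepness=1, min_contrast=20):
    best = None  # (contrast, index)
    for i in range(width, len(profile) - steepness - width):
        left = sum(profile[i - width:i]) / width
        right = sum(profile[i + steepness:i + steepness + width]) / width
        contrast = abs(right - left)
        if contrast >= min_contrast and (best is None or contrast > best[0]):
            best = (contrast, i)
    return best  # (edge strength, start edge location), or None

# A noisy dark-to-bright step around index 5.
profile = [10, 12, 11, 10, 13, 120, 122, 119, 121, 120]
print(find_edge(profile))  # (110.5, 4)
```

Increasing `width` averages more pixels on each side, which is exactly the noise-filtering effect the text describes.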
oriz Segment Right. Max Horiz Segment Length Right is always given as a pixel measurement.

Max Horiz Segment Length Row: Y-coordinate for all of the pixels in the Max Horiz Segment. Max Horiz Segment Length Row is always given as a pixel measurement.

Lengths

Table 10-3 lists the IMAQ Vision particle measurements relating to length.

Table 10-3. Lengths

• Equivalent Ellipse Minor Axis (Feret): Length of the minor axis of the ellipse with the same area as the particle and Major Axis equal in length to the Max Feret Diameter.
• Bounding Rect Width (W): Distance between Bounding Rect Left (B_L) and Bounding Rect Right (B_R). W = B_R - B_L
• Bounding Rect Height (H): Distance between Bounding Rect Top (B_T) and Bounding Rect Bottom (B_B). H = B_B - B_T
• Bounding Rect Diagonal: Distance between opposite corners of the Bounding Rect. d = sqrt(W^2 + H^2)
• Perimeter (P): Length of the outer boundary of the particle. Because the boundary is comprised of discrete pixels, IMAQ Vision subsamples the boundary points to approximate a smoother, more accurate perimeter.
• Convex Hull Perimeter (P_CH): Perimeter of the Convex Hull.
• Holes Perimeter: Sum of the perimeters of each hole in the particle.
• Max Feret Diameter (F): Distance between the Max Feret Diameter Start (F_x1, F_y1) and the Max Feret Diameter End (F_x2, F_y2). F = sqrt((F_x2 - F_x1)^2 + (F_y2 - F_y1)^2)
• Equivalent Ellipse Major Axis: Length of the major axis of the ellipse with the same area and perimeter as the particle.
ors caused by a camera imaging the grid from an angle.

Figure 3-4. Perspective and Distortion Errors

Try to position your camera perpendicular to the object you are trying to inspect to reduce perspective errors. If you need to take precise measurements from your image, correct perspective error by applying calibration techniques to your image.

Distortion

Nonlinear distortion is a geometric aberration caused by optical errors in the camera lens. A typical camera lens introduces radial distortion. This causes points that are away from the lens's optical center to appear farther away from the center than they really are. Figure 3-4c illustrates the effect of distortion on a grid of dots. When distortion occurs, information in the image is misplaced relative to the center of the field of view, but the information is not necessarily lost. Therefore, you can undistort your image through spatial calibration.

Spatial Calibration

When to Use

Spatial calibration is the process of computing pixel-to-real-world-unit transformations while accounting for many errors inherent to the imaging setup. Calibrating your imaging setup is important when you need to make accurate measurements in real-world units. An
osion are not restored by the dilation. If I is an image:

    opening(I) = dilation(erosion(I))

The closing function is a dilation followed by an erosion. This function fills tiny holes and smooths boundaries. This operation does not significantly alter the area and shape of particles because dilation and erosion are morphological complements, where borders expanded by the dilation function are then reduced by the erosion function. However, erosion does not restore any tiny holes filled during dilation. If I is an image:

    closing(I) = erosion(dilation(I))

The following figures illustrate examples of the opening and closing functions.

Figure 9-14. Opening and Closing Functions (an original image and structuring elements, with the results after opening and after closing)

Inner Gradient Function

The internal edge function subtracts the eroded image from its source image. The remaining pixels correspond to the pixels eliminated by the erosion process. If I is an image:

    internal_edge(I) = I - erosion(I) = XOR(I, erosion(I))

Outer Gradient Function

The external edge function subtracts the source image from the dilated image of the source image. The remaining pixels correspond
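The opening and closing definitions above can be sketched directly on a set-of-pixels representation of a binary image. The full 3x3 square serves as the structuring element; this is an illustrative sketch, not the IMAQ Vision implementation, and the names are made up.

```python
# Opening = dilation(erosion(I)); closing = erosion(dilation(I)).
# A binary image is represented as a set of (x, y) pixel coordinates.

NEIGHBORS = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]

def erosion(pixels):
    """Keep a pixel only if its entire 3x3 neighborhood is set."""
    return {p for p in pixels
            if all((p[0] + dx, p[1] + dy) in pixels for dx, dy in NEIGHBORS)}

def dilation(pixels):
    """Set every pixel in the 3x3 neighborhood of each set pixel."""
    return {(x + dx, y + dy) for x, y in pixels for dx, dy in NEIGHBORS}

def opening(pixels):
    return dilation(erosion(pixels))   # removes tiny particles

def closing(pixels):
    return erosion(dilation(pixels))   # fills tiny holes

# A 5x5 square plus one isolated pixel: opening keeps the square and
# deletes the lone pixel, as the text describes.
square = {(x, y) for x in range(5) for y in range(5)}
image = square | {(10, 10)}
print(opening(image) == square)  # True
```

Note how the square survives the opening almost unchanged, while the isolated pixel, which vanishes during the erosion, does not reappear after the dilation.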
ourier 1D transform of the columns of the previous results. The complex numbers that compose the FFT plane are encoded in a 64-bit floating-point image called a complex image. The complex image is formed by a 32-bit floating-point number representing the real part and a 32-bit floating-point number representing the imaginary part.

In an image, details and sharp edges are associated with moderate to high spatial frequencies because they introduce significant gray-level variations over short distances. Gradually varying patterns are associated with low spatial frequencies. By filtering spatial frequencies, you can remove, attenuate, or highlight the spatial components to which they relate.

When to Use

Use a lowpass frequency filter to attenuate or remove (truncate) high frequencies present in the image. This filter suppresses information related to rapid variations of light intensities in the spatial image. An inverse FFT, used after a lowpass frequency filter, produces an image in which noise, details, texture, and sharp edges are smoothed.

A highpass frequency filter attenuates or removes (truncates) low frequencies present in the complex image. This filter suppresses information related to slow variations of light intensities in the spatial image. In this case, an inverse FFT used after a highpass f
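The row-column decomposition described above (a 1D transform of each row, followed by a 1D transform of each column of the result) can be sketched directly. A plain O(N^2) DFT stands in for the FFT to keep the example short; `dft` and `dft2` are illustrative names, not IMAQ Vision functions.

```python
# 2D Fourier transform as two passes of a 1D transform.

import cmath

def dft(seq):
    """Naive 1D discrete Fourier transform."""
    n = len(seq)
    return [sum(seq[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def dft2(image):
    rows = [dft(row) for row in image]                   # 1D pass on rows
    cols = [dft([rows[r][c] for r in range(len(rows))])  # 1D pass on columns
            for c in range(len(rows[0]))]
    return [[cols[c][r] for c in range(len(cols))] for r in range(len(rows))]

# The (0, 0) term is the null-frequency (DC) component: the sum of
# all pixel values.
image = [[1, 2], [3, 4]]
spectrum = dft2(image)
print(round(spectrum[0][0].real))  # 10
```

A production implementation would use a true FFT for the 1D passes, which is what makes the two-pass decomposition fast.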
oving particles that touch the border of the image. You can use these parameters to identify and classify particles. The following table shows the values obtained for the particle enclosed in a rectangle shown in the figure below.

Particle Measurement Values:
• Area: 2456
• Number of Holes: 1
• Bounding Rect: Left 127, Top 8, Right 200, Bottom 86
• Center of Mass: X 167.51, Y 37.61
• Orientation: 82.36
• Dimensions: Width 73, Height 78

Part III: Particle Analysis

To use particle analysis, first create a binary image using a thresholding process. You then can improve the binary image using morphological transformations and make measurements on the particles in the image.

Thresholding

This chapter contains information about thresholding and color thresholding.

Introduction

Thresholding consists of segmenting an image into two regions: a particle region and a background region. This process works by setting to 1 all pixels that belong to a gray-level interval, called the threshold interval, and setting all other pixels in the image to 0. Use thresholding to isolate objects of interest in an image. Thresholding converts the image from a grayscale image, with pixel values ranging from 0 to 255, to a binary image, with pixel values of 0 or 1. Thresholding enables you to select r
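The thresholding operation just described can be sketched in a couple of lines; `threshold` is an illustrative name, not the IMAQ Vision API.

```python
# Thresholding: pixels inside the threshold interval [lo, hi] become 1,
# all other pixels become 0, yielding a binary image.

def threshold(image, lo, hi):
    return [[1 if lo <= p <= hi else 0 for p in row] for row in image]

gray = [[12, 200, 45],
        [90, 210, 30]]
print(threshold(gray, 100, 255))  # [[0, 1, 0], [0, 1, 0]]
```

The resulting binary image is the input for the morphological transformations and particle measurements covered in this part of the manual.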
own in Figure 1-3b. You can copy the values of the pixels along the edge of the image into the border pixels, as shown in Figure 1-3c, or you can mirror the pixel values along the edge of the image into the border pixels, as shown in Figure 1-3d.

Figure 1-3. Filling Border Pixels (the panels show the pixel values produced by filling the border with zeros, copying the edge values, and mirroring the edge values)
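The three border-filling strategies (zeros, copy, mirror) can be sketched for a one-pixel border on a single row; these helper names are made up, and a real implementation would pad both the rows and the columns of a 2D buffer.

```python
# Three ways to fill a one-pixel border around a row of pixel values.

def pad_zero(row):
    """Border filled with zeros (Figure 1-3b style)."""
    return [0] + row + [0]

def pad_copy(row):
    """Border copies the nearest edge value (Figure 1-3c style)."""
    return [row[0]] + row + [row[-1]]

def pad_mirror(row):
    """Border mirrors the values just inside the edge (Figure 1-3d style)."""
    return [row[1]] + row + [row[-2]]

row = [10, 9, 15, 11]
print(pad_zero(row))    # [0, 10, 9, 15, 11, 0]
print(pad_copy(row))    # [10, 10, 9, 15, 11, 11]
print(pad_mirror(row))  # [9, 10, 9, 15, 11, 15]
```

Border filling matters because neighborhood operations such as spatial filters and morphology need values outside the image when processing edge pixels.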
predefined coordinate reference and scaling factors. To perform a simple calibration, define a coordinate system and scaling mode. Figure 3-11 illustrates how to define a coordinate system. To set a coordinate reference, define the angle between the x-axis and the horizontal axis of the image, in degrees. Express the center as the position, in pixels, where you want the coordinate reference origin. Set the axis direction to direct or indirect. Set the scaling mode option to scale to fit or scale to preserve area. Simple calibration also offers a correction table option.

Note: If you use simple calibration with the angle set to 0, you do not need to learn for correction because you do not need to correct your image.

Figure 3-11. Defining a New Coordinate System (1. Default Origin; 2. New Origin)

Redefining a Coordinate System

You can use simple calibration to change the coordinate system assigned to a calibrated image. When you define a new coordinate system, remember the following:

• Express the origin in pixels. Always choose an origin location that lies within the calibration grid so that you can convert the location to real-world units.
• Specify the angle as the angle between the new coordinate system and the horizontal direction in the real world.

In some vision applications, you may need to image several regions of an object to
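The simple-calibration mapping (translate to the user-defined origin, rotate by the coordinate system angle, optionally flip the y-axis for a direct orientation, and scale to real-world units) can be sketched as follows. All parameter names are illustrative, not the IMAQ Vision API, and a single uniform scale stands in for the scaling-mode options.

```python
# Pixel to real-world coordinates under a simple calibration.

import math

def pixel_to_world(px, py, origin, angle_deg, scale, direct=False):
    """Map a pixel coordinate into the user-defined coordinate system."""
    a = math.radians(angle_deg)
    dx, dy = px - origin[0], py - origin[1]           # translate to origin
    x = dx * math.cos(a) + dy * math.sin(a)           # rotate by angle
    y = -dx * math.sin(a) + dy * math.cos(a)
    if direct:                                        # direct axis: y points up
        y = -y
    return (x * scale, y * scale)                     # scale to world units

# Origin at pixel (100, 50), axes unrotated, 0.5 mm per pixel:
print(pixel_to_world(120, 60, (100, 50), 0.0, 0.5))  # (10.0, 5.0)
```

With the angle set to 0 and an indirect axis direction, the mapping reduces to a translation and a scale, which is why no image correction is needed in that case.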
r noise. Blurring usually occurs because of incorrect focus or depth-of-field changes.

Color Location Concepts

Color location is built upon the color matching functions to quickly locate regions with specific color information in an image. Refer to the Color Matching section for more information. The color location functions extend the capabilities of color matching to applications in which the location of the objects in the image is unknown. Color location uses the color information in a template image to look for occurrences of the template in the search image. The basic operation moves the template across the image pixel by pixel and compares the color information at the current location in the image to the color information in the template, using the color matching algorithm.

The color location process consists of two main steps: learning the template information and searching for the template in an image. Figure 14-11 illustrates the general flow of the color location process. During the learning phase, the software extracts the color spectrum from the template image. This color spectrum is used to compare the color information of the template with the color information in the image.
ram 4-1
scale of histograms 4-4
scale-invariant matching 12-5
scaling mode in calibration 3-14
segmentation function: basic concepts 9-26; compared with skiz function 9-27
sensation of colors 1-14
sensor: size, definition 3-3; number of pixels in sensor 3-5
separation function, binary morphology 9-24
shape equivalence, digital particles: ellipse ratio 10-11; equivalent ellipse axes 10-11; equivalent rectangle 10-12; rectangle ratio 10-12
shape features, digital particles: hydraulic radius 10-13
shape matching: basic concepts 12-10; dimensional measurement 13-11; example 12-10; when to use 12-9
Sigma filter: basic concepts 5-29; mathematical concepts 5-34
skeleton functions: comparison between segmentation and skiz functions 9-27; L-skeleton 9-25; M-skeleton 9-26; skiz 9-26
skiz function: basic concepts 9-26; compared with segmentation function 9-27
smoothing filters: example 5-24; kernel definition 5-24; predefined kernels A-6
Sobel filter: basic concepts 5-27; example 5-28; mathematical concepts 5-33; predefined kernels A-2
software drivers B-1
spatial calibration: algorithms 3-11; coordinate system 3-9; correction region 3-15; definition 3-7; image correction 3-13; overview 3-7; process of calibration 3-8; quality information 3-12; redefining coordinate systems 3-17; scaling mode 3-14; simple calibration 3-16; when to use 3-7
spatial filters: categories 5-14; classi
range [M - S, M + S].

Lowpass Filter: If |P(i,j) - M| < S Then P(i,j) = P(i,j) Else P(i,j) = M. Given M, the mean value of P(i,j) and its neighbors, and S, their standard deviation, each pixel P(i,j) is set to the mean value M if it falls outside the range [M - S, M + S].

Median Filter: P(i,j) = median value of the series of neighborhood values P(n,m).

Nth Order Filter: P(i,j) = Nth value in the series of neighborhood values P(n,m), where the values are sorted in increasing order.

The following example uses a 3 x 3 neighborhood:

    13 10  9
    12  4  8
     5  5  6

The following table shows the new output value of the central pixel for each Nth-order value:

    Nth Order:        0  1  2  3  4  5  6  7  8
    New Pixel Value:  4  5  5  6  8  9 10 12 13

Notice that for a given filter size f, the Nth order can range from 0 to f^2 - 1. For example, in the case of a filter of size 3, the Nth order ranges from 0 to 8 (8 = 3^2 - 1).

Grayscale Morphology

When to Use

Morphological transformations extract and alter the structure of particles in an image. They fall into two categories:

• Binary morphology functions, which apply to binary images
• Grayscale morphology functions, which apply to gray-level images

In grayscale morphology, a pixel is compared to those pixels surrounding it in order to keep the pixel values that are the smallest (erosion) or the largest (dilation). Use grayscale morphology functions to fil
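The Nth-order filter from the table above is a one-liner once the neighborhood is collected: sort the values and take the one at rank N. The example neighborhood from the text reproduces the table exactly; `nth_order` is an illustrative name.

```python
# Nth-order (rank) filter on a single neighborhood: rank 0 is the
# minimum (a grayscale erosion), the middle rank is the median, and
# the top rank is the maximum (a grayscale dilation).

def nth_order(neighborhood, n):
    return sorted(neighborhood)[n]

neighborhood = [13, 10, 9, 12, 4, 8, 5, 5, 6]  # the 3x3 example above
print(nth_order(neighborhood, 0))  # 4  (minimum)
print(nth_order(neighborhood, 4))  # 8  (median)
print(nth_order(neighborhood, 8))  # 13 (maximum)
```

Applying this at every pixel position, with a border-filling strategy for edge pixels, yields the full filtered image.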
rce image.

Table 5-3 provides example structuring elements and the corresponding eroded and dilated images.

Table 5-3. Erosion and Dilation Examples (for each structuring element, the source image and its eroded and dilated results)

Opening Function

The gray-level opening function consists of a gray-level erosion followed by a gray-level dilation. It removes bright spots isolated in dark regions and smooths boundaries. The effects of the function are moderated by the configuration of the structuring element.

    opening(I) = dilation(erosion(I))

This operation does not significantly alter the area and shape of particles because erosion and dilation are morphological opposites. Bright borders reduced by the erosion are restored by the dilation. However, small bright particles that vanish during the erosion do not reappear after the dilation.
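Grayscale erosion and dilation, the building blocks of the gray-level opening, amount to taking the neighborhood minimum and maximum. The sketch below uses a flat 3x1 horizontal structuring element on a single row to keep it one-dimensional; the function names are illustrative, not the IMAQ Vision API.

```python
# Grayscale erosion (neighborhood minimum) and dilation (neighborhood
# maximum) with a flat 3-pixel structuring element.

def gray_erode(row):
    return [min(row[max(i - 1, 0):i + 2]) for i in range(len(row))]

def gray_dilate(row):
    return [max(row[max(i - 1, 0):i + 2]) for i in range(len(row))]

row = [5, 9, 5, 5, 2, 5]
print(gray_erode(row))   # [5, 5, 5, 2, 2, 2]
print(gray_dilate(row))  # [9, 9, 9, 5, 5, 5]
```

Composing them as `gray_dilate(gray_erode(row))` gives the gray-level opening: the isolated bright value 9 is removed by the erosion and, as the text notes, does not reappear after the dilation.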
Figure 7-1. FFT of an Image in Standard Representation

Optical Representation
In the optical representation, low frequencies are grouped at the center of the image, while high frequencies are located at the edges. The constant term, or null frequency, is at the center of the image. The frequency range extends from −N/2 to N/2 along each axis, where N is the number of pixels in that direction.

Figure 7-2a shows the same original image as shown in Figure 7-1a. Figure 7-2b shows the FFT of the image in optical representation, with low frequencies at the center, high frequencies at the edges, and the four quadrants labeled A, B, C, and D.

Figure 7-2. FFT of an Image in Optical Representation

Note: This is the representation IMAQ Vision uses when displaying a complex image. You can switch from standard representation to optical representation by permuting the A, B, C, and D quarters.

Intensities in the FFT image are proportional to the amplitude of the displayed component.

Lowpass FFT Filters
A lowpass frequency filter attenuates or removes high frequencies present in the FFT plane. This filter suppresses information related to rapid variations of light intensities in the spatial image. In this case
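The quadrant permutation that converts the standard representation to the optical representation is exactly what NumPy's `fftshift` performs. A small sketch (illustrative, not the IMAQ Vision code):

```python
import numpy as np

# Standard representation: np.fft.fft2 puts the null frequency (DC term)
# at index [0, 0], i.e. in a corner of the complex image.
img = np.ones((8, 8))           # constant image: all energy in the DC term
F = np.fft.fft2(img)
print(np.abs(F[0, 0]))           # -> 64.0 (DC term in the corner)

# Optical representation: permuting the four quadrants (A, B, C, D)
# moves the null frequency to the center of the image.
F_opt = np.fft.fftshift(F)
print(np.abs(F_opt[4, 4]))       # -> 64.0 (DC term now at the center)
```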
region. Figure 11-13 illustrates the basics of the concentric rake. Edge detection is performed along search lines that occur in the search region and that are concentric to the outer circular boundary. Control the number of concentric search lines used for the edge detection by specifying the radial distance between the concentric lines in pixels. Specify the direction of the search as either clockwise or anti-clockwise.

Figure 11-13 legend: 1 Search Area; 2 Search Line; 3 Search Direction; 4 Edge Points
Figure 11-13. Concentric Rake Function

Pattern Matching
This chapter contains information about pattern matching and shape matching.

Introduction
Pattern matching locates regions of a grayscale image that match a predetermined template. Pattern matching finds template matches regardless of poor lighting, blur, noise, shifting of the template, or rotation of the template. Use pattern matching to quickly locate known reference patterns, or fiducials, in an image.

With pattern matching, you create a model, or template, that represents the object for which you are searching. Then your machine vision application searches for the model in each acquired image, calculating a score for each match. The score relates how closely the model matches the pattern fo
difference between the reference coordinate system and the new coordinate system. Based on this difference, the software moves the new measurement areas with respect to the new coordinate system.

Figure 13-1a shows a reference image with a defined reference coordinate system. Figure 13-1b shows an inspection image with an updated coordinate system.

Figure 13-1 legend: 1 Search Area for the Coordinate System; 2 Object Edges; 3 Origin of the Coordinate System; 4 Measurement Area
Figure 13-1. Coordinate Systems of a Reference Image and Inspection Image

In-Depth Discussion
You can use four different strategies to build a coordinate system. Two strategies are based on detecting reference edges of the object under inspection. The other two strategies involve locating a specific pattern using a pattern matching algorithm.

Edge-Based Coordinate System Functions
These functions determine the axes of the coordinate system by locating edges of the part under inspection. Use an edge-based method if you can identify two straight, distinct, non-parallel edges on the object you want to locate. Because the software uses these edges as references for creating the coordinate system, choose edges that are unambiguous and always present in the object under inspection.

Single Search Area
This method involves locating the two axes of the coordinate system t
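The idea of moving measurement areas along with an updated coordinate system can be sketched as a plain 2D transform. The helper below is hypothetical (it is not an IMAQ Vision function): it re-expresses a point defined relative to the reference coordinate system once a new origin and angle have been found.

```python
import numpy as np

def remap_point(point, ref_origin, ref_angle, new_origin, new_angle):
    """Hypothetical sketch: re-express a measurement point when the part
    (and hence its coordinate system) has shifted and rotated.
    Angles are in radians; points are (x, y) pairs."""
    # Coordinates of the point relative to the reference coordinate system.
    rel = np.asarray(point, float) - np.asarray(ref_origin, float)
    c, s = np.cos(ref_angle), np.sin(ref_angle)
    local = np.array([c * rel[0] + s * rel[1], -s * rel[0] + c * rel[1]])
    # Express the same local coordinates in the new coordinate system.
    c, s = np.cos(new_angle), np.sin(new_angle)
    rotated = np.array([c * local[0] - s * local[1],
                        s * local[0] + c * local[1]])
    return rotated + np.asarray(new_origin, float)

# Part translated by (10, 5) with no rotation: the measurement point follows.
print(remap_point((3, 4), (0, 0), 0.0, (10, 5), 0.0))  # -> [13.  9.]
```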
frequency filter produces an image in which overall patterns are sharpened and details are emphasized. A mask frequency filter removes frequencies contained in a mask specified by the user. Using a mask to alter the Fourier transform of an image offers more possibilities than applying a lowpass or highpass filter. The image mask is composed by the user and can describe very specific frequencies and directions in the image. You can apply this technique, for example, to filter dominant frequencies as well as their harmonics in the frequency domain.

In an image, details and sharp edges are associated with moderate to high spatial frequencies because they introduce significant gray-level variations over short distances. Gradually varying patterns are associated with low spatial frequencies. An image can have extraneous noise, such as periodic stripes, introduced during the digitization process. In the frequency domain, the periodic pattern is reduced to a limited set of high spatial frequencies. Truncating these particular frequencies and converting the filtered FFT image back to the spatial domain produces a new image in which the grid pattern has disappeared, while the overall features remain.

Fast Fourier Transform Concepts
The FFT of an image is a two-dimensional array of complex numbers, also represented as a complex image. It represents the frequencies
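The truncation idea (zeroing selected frequencies and transforming back) can be sketched with NumPy's FFT. This is an illustrative lowpass truncation with a circular cutoff, not the IMAQ Vision routine:

```python
import numpy as np

def fft_truncate_lowpass(img, cutoff):
    """Lowpass truncation sketch: zero every FFT component whose distance
    from the null frequency exceeds `cutoff`, then transform back."""
    F = np.fft.fftshift(np.fft.fft2(img))   # null frequency at the center
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    dist = np.hypot(y - h // 2, x - w // 2)
    F[dist > cutoff] = 0                    # truncate high frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# A constant image contains only the null frequency, so it is unchanged.
img = np.full((8, 8), 100.0)
print(np.allclose(fft_truncate_lowpass(img, 2), img))  # -> True
```

A superimposed alternating (high-frequency) pattern is removed by the same call, leaving only the smooth content, which mirrors the stripe-removal example in the text.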
Making Measurements .................................................. 13-2
Qualifying Measurements ............................................. 13-3
Coordinate System ...................................................... 13-3
    When to Use ........................................................... 13-3
    Concepts ................................................................ 13-4
    In-Depth Discussion ................................................ 13-5
Finding Features or Measurement Points ...................... 13-9
    Line and Circular Features ...................................... 13-9
    Shape-Based Features ............................................. 13-11
Making Measurements on the Image ............................ 13-12
    Distance Measurements .......................................... 13-12
    Analytic Geometry .................................................. 13-13

Chapter 14
Color Inspection
The Color Spectrum ..................................................... 14-1
    Color Space Used to Generate the Spectrum ............ 14-1
    Generating the Color Spectrum ............................... 14-3
Color Matching ........................................................... 14-5
    When to Use ........................................................... 14-6
        Color Identification ............................................. 14-6
        Color Inspection .................................................. 14-7
    Color Matching Concepts ........................................ 14-8
        Learning Color Distribution ................................. 14-9
        Comparing Color Distributions ............................. 14-9
Color Location ............................................................ 14-10
    When to Use ........................................................... 14-10
        Inspection
Structuring Elements

When to Use
Morphological operators that change the shape of particles process a pixel based on its number of neighbors and the values of those neighbors. A neighbor is a pixel whose value affects the values of nearby pixels during certain image processing functions. Morphological transformations use a 2D binary mask called a structuring element to define the size and effect of the neighborhood on each pixel, controlling the effect of the binary morphological functions on the shape and the boundary of a particle.

Use a structuring element when you perform any primary binary morphology operation or the advanced binary morphology operation Separation. You can modify the size and the values of a structuring element to alter the shape of particles in a specific way. However, study the basic morphology operations before defining your own structuring element.

Concepts
The size and contents of a structuring element specify which pixels a morphological operation takes into account when determining the new value of the pixel being processed. A structuring element must have odd-sized axes to accommodate a center pixel, which is the pixel being processed. The contents of the structuring element are always binary, composed of 1 and 0 values. The most common structuring element is a 3 x 3
work with meters or gauges that have either a dark needle on a light background or a light needle on a dark background.

Meter Algorithm Limits
This section explains the limit conditions of the algorithm used for the meter functions. The algorithm is fairly insensitive to light variations. The position of the base of the needle is very important in the detection process, so carefully draw the lines that indicate the initial and the full-scale positions of the needle. The coordinates of the base and of the points of the arc traced by the tip of the needle are computed during the setup phase. These coordinates are used to read the meter during inspection.

LCD Functions
LCD functions simplify and accelerate the development of applications that require reading values from seven-segment displays. Use these functions to extract seven-segment digit information from an image. The reading process consists of two phases:
• A learning phase, during which the user specifies an area of interest in the image to locate the seven-segment display
• A reading phase, during which the area specified by the user is analyzed to read the seven-segment digits

The IMAQ Vision LCD functions provide the high-level vision processes required for recognizing and reading seven-segment digit indicators. The VIs in this library are designed for
Median Filter
The median filter is a lowpass filter. It assigns to each pixel the median value of its neighborhood, effectively removing isolated pixels and reducing detail. However, the median filter does not blur the contour of objects. You can implement the median filter by performing an Nth order filter and setting the order to (f^2 − 1)/2 for a given filter size of f x f.

Nth Order Filter
The Nth order filter is an extension of the median filter. It assigns to each pixel the Nth value of its neighborhood when sorted in increasing order. The value N specifies the order of the filter, which you can use to moderate the effect of the filter on the overall light intensity of the image. A lower order corresponds to a darker transformed image; a higher order corresponds to a brighter transformed image.

To see the effect of the Nth order filter, consider the example of an image with bright objects and a dark background. When viewing this image with the gray palette, the objects have higher gray-level values than the background.

For a given filter size f x f:
• If N < (f^2 − 1)/2, the Nth order filter has the tendency to erode bright regions (or dilate dark regions), smoothing the image.

Example for a filter size 3 x 3:
• If N = 0 (minimum), each pixel is replaced by its local minimum, which erodes bright
from ROIs rather than from an entire image. To use this technique, the necessary parts of the object must always appear inside the ROIs you define.

Usually the object under inspection appears shifted or rotated within the images you want to process. When this occurs, the ROIs need to shift and rotate in the same way as the object. In order for the ROIs to move in relation to the object, you must locate the object in every image. Locating the object in the image involves determining the (x, y) position and the orientation of the object using the reference coordinate system functions. You can build a coordinate reference using edge detection or pattern matching.

Locating Features
To gauge an object, you need to find landmarks, or object features, on which you can base your measurements. In most applications, you can make measurements based on points detected in the image or geometric fits to the detected points. Object features that are useful for measurements fall into two categories:
• Edge points located along the boundary of an object, found by an edge detection method
• Shapes or patterns within the object, located by pattern matching

Making Measurements
You can make different types of measurements from the features found in the image. Typical measurements include the distance between points, the angle between two lines represented by three or four points, the best line, circular, or elliptical fits, and the areas of geometric s
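Two of the typical measurements named above, point-to-point distance and the angle between two lines, reduce to elementary analytic geometry. A short illustrative sketch (not IMAQ Vision code):

```python
import numpy as np

def point_distance(p1, p2):
    # Euclidean distance between two detected feature points.
    return float(np.hypot(p2[0] - p1[0], p2[1] - p1[1]))

def angle_between_lines(a1, a2, b1, b2):
    # Angle in degrees between the line through a1-a2 and the line b1-b2.
    v1 = np.subtract(a2, a1)
    v2 = np.subtract(b2, b1)
    cos = abs(np.dot(v1, v2)) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

print(point_distance((0, 0), (3, 4)))                        # -> 5.0
print(angle_between_lines((0, 0), (1, 0), (0, 0), (1, 1)))   # -> 45.0 (up to rounding)
```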
erosions.
• Uses k + 1 erosions.

N is an odd number (N = 2k + 1):
• Lowpass: removes particles with a width less than or equal to 2k; uses k erosions.
• Highpass: removes particles with a width greater than or equal to 2k + 1; uses k erosions.

Figure 9-22a shows the binary source image used in this example. Figure 9-22b shows the result of applying lowpass and highpass filters for a given filter size: gray particles and white particles are filtered out by the lowpass and highpass filters, respectively.

Figure 9-22. Lowpass and Highpass Filter Functions

Separation Function
The separation function breaks narrow isthmuses and separates touching particles with respect to a user-specified filter size. This operation uses erosions, labeling, and conditional dilations. For example, after thresholding an image, two gray-level particles overlapping one another might appear as a single binary particle. You can observe a narrowing where the original particles intersected each other. If the narrowing has a width of M pixels, a separation using a filter size of M + 1 breaks it and restores the two original particles. This applies to all particles that contain a narrowing shorter than N pixels. For a given filter size N, the separation function segments particles with a narrowing shorter than or equal to
For each pixel P(i,j) in an image, where i and j represent the coordinates of the pixel, the convolution kernel is centered on P(i,j). Each pixel masked by the kernel is multiplied by the coefficient placed on top of it. P(i,j) becomes the sum of these products divided by the sum of the coefficients, or 1, whichever is greater.

In the case of a 3 x 3 neighborhood, the pixels surrounding P(i,j) and the coefficients of the kernel K can be indexed as follows:

P(i−1, j−1)  P(i, j−1)  P(i+1, j−1)        K(i−1, j−1)  K(i, j−1)  K(i+1, j−1)
P(i−1, j)    P(i, j)    P(i+1, j)          K(i−1, j)    K(i, j)    K(i+1, j)
P(i−1, j+1)  P(i, j+1)  P(i+1, j+1)        K(i−1, j+1)  K(i, j+1)  K(i+1, j+1)

The pixel P(i,j) is given the value

P(i,j) = (1/N) Σ K(a,b) P(a,b)

with a ranging from i − 1 to i + 1 and b ranging from j − 1 to j + 1. N is the normalization factor, equal to Σ K(a,b) or 1, whichever is greater.

If the new value P(i,j) is negative, it is set to 0. If the new value P(i,j) is greater than 255, it is set to 255 (in the case of 8-bit resolution).

The greater the absolute value of a coefficient K(a,b), the more the pixel P(a,b) contributes to the new value of P(i,j). If a coefficient K(a,b) is 0, the neighbor P(a,b) does not contribute to the new value of P(i,j) (notice that P(a,b) might be P(i,j) itself).

If the convolution kernel is

0 0 0
2 1 2
0 0 0

then P(i,j) = (2P(i−1,j) + P(i,j) + 2P(i+1,j))/5.

If the convolution kernel is
0
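The 3 x 3 convolution rule above, including the normalization by N = max(ΣK, 1) and the clipping to [0, 255], can be sketched directly. This is an illustration of the formula, not the IMAQ Vision implementation; border pixels are left unchanged as a simplifying assumption.

```python
import numpy as np

def convolve3x3(img, kernel):
    """3x3 convolution sketch: each interior pixel becomes the weighted
    sum of its neighborhood divided by N = max(sum(K), 1), clipped to
    the 8-bit range [0, 255]. Border pixels are left unchanged."""
    norm = max(kernel.sum(), 1)
    out = img.astype(np.int64).copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            acc = (kernel * img[i - 1:i + 2, j - 1:j + 2]).sum()
            out[i, j] = min(max(acc // norm, 0), 255)
    return out.astype(np.uint8)

# Smoothing kernel: all ones, so N = 9 and each interior pixel
# becomes the local mean of its 3x3 neighborhood.
img = np.full((3, 3), 90, dtype=np.uint8)
img[1, 1] = 180
print(convolve3x3(img, np.ones((3, 3), dtype=np.int64))[1, 1])  # -> 100
```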
particles in an image.

pattern matching: The technique used to quickly locate a grayscale template within a grayscale image.

picture aspect ratio: The ratio of the active pixel region to the active line region. For standard video signals, like RS-170 or CCIR, the full-size picture aspect ratio is normally 4/3 (1.33).

picture element: An element of a digital image. Also called pixel.

pixel: Picture element. The smallest division that makes up the video scan line. For display on a computer monitor, a pixel's optimum dimension is square (aspect ratio of 1:1, or the width equal to the height).

pixel aspect ratio: The ratio between the physical horizontal size and the vertical size of the region covered by the pixel. An acquired pixel should optimally be square, thus the optimal value is 1.0, but typically it falls between 0.95 and 1.05, depending on camera quality.

pixel calibration: Directly calibrating the physical dimensions of a pixel in an image.

pixel depth: The number of bits used to represent the gray level of a pixel.

PNG: Portable Network Graphics. Image file format for storing 8-bit, 16-bit, and color images with lossless compression (extension PNG).

Similar to a logarithmic function, but with a weaker effect. See exponential function.

Prewitt filter: Extracts the contours (edge detection) in gray-level values using a 3 x 3 filter kernel.

probability function: Defines the probability that a pixel in an image has a certain gray-level value.

proper-closing: A finite combination of successive closing and opening operations that you can use to fill small holes and smooth the boundaries of objects.

A f
ry, the entropy of the histogram signifies the amount of information associated with the histogram. Let p(i) = h(i)/n represent the probability of occurrence of the gray level i. The entropy of a histogram of an image with gray levels in the range [0, N − 1] is given by

H = −Σ (i = 0 to N − 1) p(i) log2 p(i)

If k is the value of the threshold, then the two entropies

Hb = −Σ (i = 0 to k) p(i) log2 p(i)
Hw = −Σ (i = k + 1 to N − 1) p(i) log2 p(i)

represent the measures of the entropy (information) associated with the black and white pixels in the image after thresholding. The optimal threshold value is the gray-level value that maximizes the entropy in the thresholded image, given by H = Hb + Hw. Simplified, the threshold value is the pixel value k at which the expression Hb + Hw is maximized.

InterVariance
The InterVariance method works similarly to the clustering technique.

Metric
The threshold value is the pixel value k at which the following expression is minimized:

Σ (i = 0 to k) h(i)·|i − μ1| + Σ (i = k + 1 to N − 1) h(i)·|i − μ2|

where μ1 is the mean of all pixel values in the image that lie between 0 and k, and μ2 is the mean of all the pixel values in the image that lie between k + 1 and 255.

Moments
In t
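A sketch of the entropy criterion: scan every candidate threshold k and keep the one maximizing the summed class entropies. This illustrative variant renormalizes each class distribution (as in Kapur's well-known formulation); it is not the exact IMAQ Vision routine.

```python
import numpy as np

def entropy_threshold(hist):
    """Entropy-based threshold sketch: choose k maximizing the summed
    entropies of the pixels below and above the threshold."""
    p = hist / hist.sum()            # p(i): probability of gray level i
    best_k, best_h = 0, -np.inf
    for k in range(len(p) - 1):
        lo, hi = p[:k + 1], p[k + 1:]
        w_lo, w_hi = lo.sum(), hi.sum()
        if w_lo == 0 or w_hi == 0:
            continue
        # Entropy of each class, using the renormalized distributions.
        q_lo, q_hi = lo[lo > 0] / w_lo, hi[hi > 0] / w_hi
        h = -(q_lo * np.log2(q_lo)).sum() - (q_hi * np.log2(q_hi)).sum()
        if h > best_h:
            best_k, best_h = k, h
    return best_k

# Two uniform populations at gray levels 0-49 and 200-249: the
# returned threshold separates them.
hist = np.zeros(256)
hist[0:50] = 1
hist[200:250] = 1
print(entropy_threshold(hist))  # a value between the two populations
```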
252. s attenuation 7 8 examples 7 9 overview 7 2 truncation 7 9 lowpass FFT filters attenuation 7 6 examples 7 7 overview 7 2 truncation 7 6 mask FFT filters overview 7 2 purpose and use 7 10 overview 7 1 when to use 7 2 frequency processing 7 1 G gauging application See also dimensional measurements color pattern matching 14 18 edge detection 11 2 pattern matching 12 1 Gaussian filters example 5 26 kernel definition 5 26 predefined kernels A 7 geometric measurements 13 13 IMAQ Vision Concepts Manual Index gradient filters linear definition 5 15 edge extraction and edge highlighting 5 18 edge thickness 5 19 example 5 15 filter axis and direction 5 17 kernel definition 5 16 nonlinear definition 5 28 mathematical concepts 5 33 predefined kernels Prewitt filters A 1 Sobel filters A 2 Gradient palette 2 7 Gray palette 2 5 gray level values in Binary palette table 2 7 of pixels 1 1 grayscale images pixel encoding 1 4 transforming RGB to grayscale 1 20 grayscale morphology functions auto median concepts and mathematics 5 41 overview 5 39 basic concepts 5 36 closing opening and closing examples 5 38 overview 5 38 concepts and mathematics 5 39 dilation concepts and mathematics 5 40 erosion and dilation examples 5 36 overview 5 36 erosion concepts and mathematics 5 39 erosion and dilation examples 5 36 overview 5 36 IMAQ Vision Concepts Manual opening
as a pixel measurement.

Angles
Table 10-6 lists the IMAQ Vision particle angle measurements. The equations are given in radians. The results are given in degrees, mapped into the range 0 to 180 such that 0 <= theta < 180.

Table 10-6. Angles
• Orientation: The angle of the line that passes through the particle Center of Mass about which the particle has the lowest moment of inertia. Equation: (1/2) atan(2 Ixy / (Iyy − Ixx))
• Max Feret Diameter Orientation: The angle of the Max Feret Diameter. Equation: atan((Fy2 − Fy1) / (Fx2 − Fx1))

The Orientation angle is measured counterclockwise from the horizontal axis, as shown in Figure 10-4. The value can range from 0 to 180. Angles outside this range are mapped into the range. For example, a 190 degree angle is considered to be a 10 degree angle.

Figure 10-4 legend: 1 Line with Lowest Moment of Inertia; 2 Orientation in Degrees; 3 Horizontal Axis
Figure 10-4. Orientation

Note: Refer to the Max Feret Diameter entry in Table 10-1 for an illustration of Max Feret Diameter Orientation.

Ratios
Table 10-7 lists the IMAQ Vision particle ratio measurements.

Table 10-7. Ratios
Measurement: %Area/Image Area. Definition: Percentage
contains all the video information that a monochrome television set requires. The main advantage of the YIQ space for image processing is that the luminance information (Y) is decoupled from the color information (I and Q). Because luminance is proportional to the amount of light perceived by the eye, modifications to the grayscale appearance of the image do not affect the color information.

In-Depth Discussion
There are standard ways to convert RGB to grayscale and to convert one color space to another. The transformation from RGB to grayscale is linear. However, some transformations from one color space to another are nonlinear because some color spaces represent colors that cannot be represented in other spaces.

RGB to Grayscale
The following equation converts an RGB image into a grayscale image on a pixel-by-pixel basis:

grayscale value = 0.299R + 0.587G + 0.114B

This equation is part of the NTSC standard for luminance. An alternative conversion from RGB to grayscale is a simple average:

grayscale value = (R + G + B)/3

RGB and HSL
There is no matrix operation that allows you to convert from the RGB color space to the HSL color space. The following equations describe the nonlinear transformation that maps the RGB color space to the HSL color space:

L = 0.299R + 0.587G + 0.114B
V2 = sqrt(3) (G − B)
V1 = 2R − G
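The two RGB-to-grayscale conversions above translate directly into code. An illustrative sketch:

```python
# NTSC luminance conversion and the simple average, per the equations above.
def rgb_to_gray_ntsc(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def rgb_to_gray_average(r, g, b):
    return (r + g + b) / 3

# Pure white maps to full scale; pure green maps to 0.587 of full scale,
# reflecting the eye's higher sensitivity to green.
print(rgb_to_gray_ntsc(255, 255, 255))   # -> 255.0 (up to rounding)
print(rgb_to_gray_ntsc(0, 255, 0))       # -> 149.685
print(rgb_to_gray_average(0, 255, 0))    # -> 85.0
```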
s and Sectors
The color sensitivity parameter determines the number of sectors the hue space is divided into. Figure 14-2a shows the hue color space when luminance is equal to 128. Figure 14-2b shows the hue space divided into a number of sectors, depending on the desired color sensitivity. Figure 14-2c shows each sector divided further into a high-saturation bin and a low-saturation bin. The saturation threshold determines the radius of the inner circle that separates each sector into bins.

Figure 14-3 shows the correspondence between the color spectrum elements and the bins in the color space. The first element in the color spectrum array represents the high-saturation part of the first sector, the second element represents the low-saturation part, the third element the high-saturation part of the second sector, and so on. If there are n bins in the color space, the color spectrum array contains n + 2 elements. The last two components in the color spectrum represent the black and white colors, respectively.

Figure 14-3 legend: Element 1, Element 2, Element 3, ..., Element n + 1 (black), Element n + 2 (white)
Figure 14-3. Hue Color Space and the Color Spectrum Array Relationship

A color spectrum with a larger number of bins (elements) represents the color information
256. s be liable for any damages arising out of or related to this document or the information contained in it EXCEPT AS SPECIFIED HEREIN NATIONAL INSTRUMENTS MAKES NO WARRANTIES EXPRESS OR IMPLIED AND SPECIFICALLY DISCLAIMS ANY WARRANTY OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE CUSTOMER S RIGHT TO RECOVER DAMAGES CAUSED BY FAULT OR NEGLIGENCE ON THE PART OF NATIONAL INSTRUMENTS SHALL BE LIMITED TO THE AMOUNT THERETOFORE PAID BY THE CUSTOMER NATIONAL INSTRUMENTS WILL NOT BE LIABLE FOR DAMAGES RESULTING FROM LOSS OF DATA PROFITS USE OF PRODUCTS OR INCIDENTAL OR CONSEQUENTIAL DAMAGES EVEN IF ADVISED OF THE POSSIBILITY THEREOF This limitation of the liability of National Instruments will apply regardless of the form of action whether in contract or tort including negligence Any action against National Instruments must be brought within one year after the cause of action accrues National Instruments shall not be liable for any delay in performance due to causes beyond its reasonable control The warranty provided herein does not cover damages defects malfunctions or service failures caused by owner s failure to follow the National Instruments installation operation or maintenance instructions owner s modification of the product owner s abuse misuse or negligent acts and power failure or surges fire flood accident actions of third parties or other events outside reasonable control Copyright Under the copyright laws th
is equal to the coefficient of the structuring element placed on top of it, then the pixel P(i,j) is set to 1; else the pixel P(i,j) is set to 0. In other words, if the pixels in the neighborhood define the exact same template as the structuring element, then P(i,j) = 1; else P(i,j) = 0.

Figures 9-16b, 9-16c, 9-16d, and 9-16e show the result of three hit-miss functions applied to the same source image, represented in Figure 9-16a. Each hit-miss function uses a different structuring element, which is specified above each transformed image. Gray cells indicate pixels equal to 1.

Figure 9-16. Hit-Miss Function

A second example of the hit-miss function shows how, given the binary image shown in Figure 9-17, the function can locate various patterns specified in the structuring element. The results are displayed in Table 9-4.

Figure 9-17. Binary Image before Application of Hit-Miss Function

Table 9-4. Using the Hit-Miss Function
Strategy: Use the hit-miss function to locate pixels isolated in a background. The structuring element on the right extracts all pixels equal to 1 that are surrounded by at least two layers of pixels equal to 0:

0 0 0 0 0
0 0 0 0 0
0 0 1 0 0
0 0 0 0 0
0 0 0 0 0

Strategy: Use the hit-miss function to locate single-pixel holes in particles. The structurin
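The exact-template rule of the hit-miss function is easy to sketch: slide the structuring element over the image and set the output pixel to 1 only where the neighborhood matches exactly. An illustrative Python/NumPy sketch (borders where the element does not fit are left at 0):

```python
import numpy as np

def hit_miss(img, se):
    """Hit-miss sketch: the output pixel is 1 only where the binary
    neighborhood exactly matches the structuring element."""
    sh, sw = se.shape
    out = np.zeros_like(img)
    for i in range(img.shape[0] - sh + 1):
        for j in range(img.shape[1] - sw + 1):
            if np.array_equal(img[i:i + sh, j:j + sw], se):
                out[i + sh // 2, j + sw // 2] = 1
    return out

# Locate a pixel isolated in a background: a 1 surrounded by 0s.
se = np.zeros((3, 3), dtype=np.uint8)
se[1, 1] = 1
img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 1          # isolated pixel: matched
img[0, 0] = 1          # corner pixel: its neighborhood never matches
result = hit_miss(img, se)
print(result[2, 2])    # -> 1
```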
s in volts.

default: A default parameter value recorded in the driver. In many cases, the default input of a control is a certain value (often 0).

definition: The number of values a pixel can take on, which is the number of colors or shades that you can see in the image.

Branches of the skeleton of an object.

densitometry: Determination of optical or photographic density.

density function: For each gray level in a linear histogram, the function gives the number of pixels in the image that have the same gray level.

Extracts the contours (edge detection) in gray level

digital image: An image f(x, y) that has been converted into a discrete number of pixels. Both spatial coordinates and brightness are specified.

dilation: Increases the size of an object along its boundary and removes tiny holes in the object.

distance calibration: Determination of the physical dimensions of a pixel by defining the physical dimensions of a line in the image.

distance function: Assigns to each pixel in an object a gray-level value equal to its shortest Euclidean distance from the border of the object.

distribution function: For each gray level in a linear histogram, the function gives the number of pixels in the image that are less than or equal to that gray level.

driver: Software

dynamic range
edge
edge contrast
edge detection
edge hysteresis
edge steepness
energy center
entropy
equalize function
operates on multiple classes, so you can create tertiary or higher-level images. The other four methods (entropy, metric, moments, and interclass variance) are reserved for strictly binary thresholding techniques. The choice of which algorithm to apply depends on the type of image to threshold.

Clustering
This technique sorts the histogram of the image into a discrete number of classes corresponding to the number of phases perceived in an image. The gray values are determined, and a barycenter is computed for each class. This process repeats until it obtains a value that represents the center of mass for each phase or class. The automatic thresholding method most frequently used is clustering, also known as multi-class thresholding.

Example of Clustering
This example applies the clustering technique in two and three phases to an image. Notice that the results from this function are generally independent of the lighting conditions as well as the histogram values of the image. This example uses the following original image.

Clustering in three phases produces the following image.

Entropy
Based on a classical image analysis technique, entropy is best for detecting particles that are present in minuscule proportions in the image. For example, this function would be suitable for
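For the two-phase case, the clustering idea above reduces to a simple iteration: split at a threshold, compute the barycenter (mean) of each class, and move the threshold to the midpoint of the two means until it stabilizes. An illustrative sketch, not the IMAQ Vision implementation:

```python
import numpy as np

def clustering_threshold(img, max_iter=100):
    """Two-phase clustering sketch: iteratively split the pixel values at
    the midpoint of the two class means until the threshold stabilizes."""
    t = img.mean()
    for _ in range(max_iter):
        lo, hi = img[img <= t], img[img > t]
        if len(lo) == 0 or len(hi) == 0:
            break
        new_t = (lo.mean() + hi.mean()) / 2   # midpoint of the barycenters
        if abs(new_t - t) < 0.5:
            break
        t = new_t
    return t

# Two well-separated populations: the threshold settles between them.
img = np.array([10, 12, 11, 200, 205, 198], dtype=float)
print(clustering_threshold(img))  # -> 106.0
```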
as shown in Figure 4-1.

Lack of Contrast
A widely used type of imaging application involves inspecting, measuring, and counting parts of interest in a scene. A strategy to separate the objects from the background relies on a difference in the intensities of both, for example, a bright part and a darker background. In this case, the analysis of the histogram of the image reveals two or more well-separated intensity populations, as shown in Figure 4-2. Tune your imaging setup until the histogram of your acquired images has the contrast required by your application.

The histogram is the function H defined on the grayscale range [0, ..., k, ..., 255] such that the number of pixels equal to the gray-level value k is

H(k) = n_k

where k is the gray-level value, n_k is the number of pixels in the image with a gray-level value equal to k, and Σ n_k = n is the total number of pixels in the image.

The following histogram plot reveals which gray levels occur frequently and which occur rarely.

Figure 4-1. Histogram Plot

Two types of histograms can be calculated: the linear and cumulative histograms. In both cases, the horizontal axis represents the gray-level value, which ranges from 0 to 255. For a gray-level value k, the vertical axis of the linear histogram indicates the number of pixels n_k set to the v
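The definition H(k) = n_k translates into a direct counting loop; the cumulative histogram is just its running sum. An illustrative sketch:

```python
import numpy as np

def histogram(img, levels=256):
    """Linear histogram sketch: H(k) = number of pixels with gray level k."""
    h = np.zeros(levels, dtype=np.int64)
    for value in img.ravel():
        h[value] += 1
    return h

img = np.array([[0, 1, 1],
                [2, 1, 0]])
h = histogram(img)
print(h[:3])                 # -> [2 3 1]
print(h.sum())               # -> 6 (the total number of pixels n)
cumulative = np.cumsum(h)    # cumulative histogram: pixels <= each level
print(cumulative[:3])        # -> [2 5 6]
```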
search. One new pattern matching technique uses non-uniform sampling. Because most images contain redundant information, using all the information in the image to match patterns is time-intensive and inaccurate. You can improve the speed and accuracy of the pattern matching tool by sampling the image and extracting a few points that represent the overall content of the image.

Figure 12-5 shows an example of good sampling techniques for pattern matching. Figure 12-5a shows the original template, or reference image. The black dots on Figure 12-5b represent the points on the image that are used to represent the template.

Figure 12-5. Good Pattern Matching Sampling Techniques

Many of the new techniques for pattern matching use the image edge information to provide information about the structure of the image. The amount of information in the image is reduced to contain only significant data about the edges. You can process the edge image further to extract higher-level geometric information about the image, such as the number of straight lines or circles present in the image. Pattern matching is then limited to the matching of edge and/or geometric information between the template and the image.

Figure 12-6 illustrates the importance of edges and geometric modeling. Figure 12-6a shows the reference pattern, Figure 12-6b shows the edge information in the pattern, and Figure 12-6c shows the high
shape. In many applications, especially in the pharmaceutical and plastic industries, objects are sorted according to color, such as pills and plastic pellets. Figure 14-9 shows an example of how to sort different-colored candies. Using color templates of the different candies in the image, color location quickly locates the positions of the different candies.

Figure 14-9. Sorting Candy by Color Information

What to Expect from a Color Location Tool
In automated machine vision applications, the visual appearance of inspected materials or components changes because of factors such as orientation of the part, scale changes, and lighting changes. The color location tool maintains its ability to locate the reference patterns despite these changes. The color location tool provides accurate results in the following common situations: pattern orientation and multiple instances, ambient lighting conditions, and blur and noise conditions.

Pattern Orientation and Multiple Instances
A color location tool locates the reference pattern in an image even if the pattern in the image is rotated or scaled. When a pattern is rotated or slightly scaled in the image, the color location tool can detect the following:
• The pattern in the image
• T
A fuzzy membership weighting function is applied to both color spectrums before computing the distance between them. The weighting function compensates for some errors that may occur during the binning process in the color space. The fuzzy color comparison approach provides a robust and accurate quantitative match score. The match score, ranging from 0 to 1000, defines the similarity between the color spectrums. A score of 0 represents no similarity between the color spectrums, whereas a score of 1000 represents a perfect match. Figure 14-7 shows the comparison process: the template's color spectrum and the color spectrum of the image or inspection area each pass through the fuzzy weighting function, and their absolute difference yields a match score between 0 and 1000.

Figure 14-7. Comparing Two Spectrums for Similarity

Color Location

When to Use

Use color location to quickly locate known color regions in an image. With color location, you create a model, or template, that represents the colors that you are searching for. Your machine vision application then searches for the model in each acquired image and calculates a score for each match. The score indicates how closely the color information in the model matches the color information in the found regions. Color can simplify a monochrome visual inspection…
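A minimal sketch of such a score, assuming normalized spectrums (histograms summing to 1) and a caller-supplied weighting function. The exact fuzzy weighting NI uses is not published, so uniform weights are the hypothetical default here:

```python
def match_score(template_spectrum, sample_spectrum, weights=None):
    """Compare two normalized color spectrums (equal-length histograms).
    Returns a score from 0 (no similarity) to 1000 (perfect match)."""
    if weights is None:
        weights = [1.0] * len(template_spectrum)
    # Weighted sum of absolute bin differences; 0 when spectrums are identical.
    diff = sum(w * abs(t - s)
               for w, t, s in zip(weights, template_spectrum, sample_spectrum))
    # Normalized spectrums sum to 1 each, so the unweighted difference is at most 2.
    return max(0.0, 1000.0 * (1.0 - diff / 2.0))
```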
…using an imaging sensor converts an image into a discrete number of pixels. The imaging sensor assigns to each pixel a numeric location and a gray-level or color value that specifies the brightness or color of the pixel.

Properties of a Digitized Image

A digitized image has three basic properties: resolution, definition, and number of planes.

Image Resolution

The spatial resolution of an image is its number of rows and columns of pixels. An image composed of m columns and n rows has a resolution of m x n. This image has m pixels along its horizontal axis and n pixels along its vertical axis.

Image Definition

The definition of an image indicates the number of shades that you can see in the image. The bit depth of an image is the number of bits used to encode the value of a pixel. For a given bit depth of n, the image has an image definition of 2^n, meaning a pixel can have 2^n different values. For example, if n equals 8 bits, a pixel can take 256 different values, ranging from 0 to 255. If n equals 16 bits, a pixel can take 65,536 different values, ranging from 0 to 65,535 or from -32,768 to 32,767. Currently, IMAQ Vision supports only the range -32,768 to 32,767 for 16-bit images. IMAQ Vision can process images with 8-bit, 10-bit, 12-bit, 14-bit, 16-bit, floating-point, or color encoding. The manner in which…
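The bit-depth arithmetic above is easy to check directly. A small helper (illustrative only, not an IMAQ Vision function):

```python
def definition(bit_depth, signed=False):
    """Number of distinct pixel values and the value range for a bit depth."""
    count = 2 ** bit_depth
    if signed:
        # Two's-complement style range, e.g. -32768..32767 for 16 bits.
        return count, (-(count // 2), count // 2 - 1)
    return count, (0, count - 1)
```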
In IMAQ Vision, you specify the color space associated with an image when you create the image. IMAQ Vision supports the RGB and HSL color spaces. If you expect the lighting conditions to vary considerably during your color machine vision application, use the HSL color space. The HSL color space provides more accurate color information than the RGB space when running color processing functions such as color matching, color location, and color pattern matching. IMAQ Vision's advanced algorithms for color processing, which perform under various lighting and noise conditions, process images in the HSL color space.

If you do not expect the lighting conditions to vary considerably during your application, and you can easily define the colors you are looking for using red, green, and blue, use the RGB space. Also use the RGB space if you want only to display color images, but not process them, in your application. The RGB space reproduces an image as you would expect to see it. IMAQ Vision always displays color images in the RGB space; if you create an image in the HSL space, IMAQ Vision automatically converts the image to the RGB space before displaying it. For more information about using color images, refer to Chapter 14, Color Inspection.

Because color is the brain's reaction to a specific visual stimulus, color…
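For intuition about the RGB-to-HSL separation of color and lightness, the Python standard library's `colorsys` module can be used. Note this is a generic HLS conversion, not IMAQ Vision's internal one, and IMAQ's 8-bit scaling conventions may differ:

```python
import colorsys

def rgb_to_hsl(r, g, b):
    """Convert 8-bit RGB to (hue, saturation, lightness), each scaled 0-255.
    colorsys uses the HLS ordering internally, so the result is reordered."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return round(h * 255), round(s * 255), round(l * 255)
```

Pure red maps to hue 0 with full saturation, while any gray has zero saturation, which is why hue/saturation are more stable than R, G, B under lighting changes.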
…Vision subsamples the boundary points to approximate a smoother, more accurate perimeter. Boundary points are the corners of the pixels that make up the boundary.

Particle hole: Contiguous region of zero-valued pixels completely surrounded by pixels with nonzero values. Refer to the Particle Holes section for more information.

Angle: Degrees of rotation measured counterclockwise from the x-axis, such that 0 <= angle < 180.

Equivalent Rect: Rectangle with the same perimeter and area as the particle.

Equivalent Ellipse: Ellipse with the same perimeter and area as the particle.

Table 10-1. Particle Concepts (Continued)

Max Feret Diameter: Line segment connecting the two perimeter points that are the farthest apart.

Max Feret Diameter Start: Highest, leftmost of the two points defining the Max Feret Diameter.

Max Feret Diameter End: Lowest, rightmost of the two points defining the Max Feret Diameter.

Max Feret Diameter Orientation: Orientation of the Max Feret Diameter. (The accompanying figure labels 1: Particle Perimeter, 2: Max Feret Diameter.)

Convex Hull: Smallest convex polygon containing all points in the particle. The following figure illustrates two particles (shown in gray) and their respective convex hulls (the areas enclosed by black lines).

Max Horiz Segment: Longest row of contiguous pixels in the particle…
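The Max Feret Diameter definition above translates directly into code. This brute-force sketch (not NI's implementation, which would use a faster convex-hull method) finds the farthest-apart pair of perimeter points:

```python
from math import hypot

def max_feret_diameter(points):
    """Brute-force Max Feret Diameter: the greatest distance between any
    two perimeter points. O(n^2), fine for small boundaries.
    Returns (length, (start_point, end_point))."""
    best = 0.0
    pair = (None, None)
    for i, (x1, y1) in enumerate(points):
        for x2, y2 in points[i + 1:]:
            d = hypot(x2 - x1, y2 - y1)
            if d > best:
                best, pair = d, ((x1, y1), (x2, y2))
    return best, pair
```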
…spectrum of the region under consideration. This color spectrum is then compared with the template's color spectrum to compute a match score.

The search step is divided into two phases. First, the software performs a coarse-to-fine search phase that identifies all possible locations, even those with very low match scores. The objective of this phase is to quickly find possible locations in the image that may be potential matches to the template information. Because stepping through the image pixel by pixel and computing match scores is time-consuming, some techniques are used to speed up the search process. The following techniques allow for a quick search:

- Subsampling: When stepping through the image, the color information is taken from only a few sample points in the image to use for comparison with the template. This reduces the amount of data used to compute the color spectrum in the image, which speeds up the search process.
- Step size: Instead of moving the template across the image pixel by pixel, the search process skips a few pixels between each color comparison, speeding up the search process. The step size indicates the number of pixels to skip. For color location, the initial step size can be as large as half the size of the template.

The initial search phase generates a list of possible match locations in the image. In the second phase, that list is searched for the location of the best match using a hill-climbing algorithm…
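The two-phase idea (coarse scan with a step size, then hill climbing) can be sketched as follows. This is a simplified stand-in, not NI's algorithm; `score(x, y)` represents any match-score function where higher is better:

```python
def locate(score, width, height, step):
    """Coarse scan skipping `step` pixels, then greedy hill climbing
    from the best coarse location to the local score maximum."""
    # Phase 1: coarse scan, comparing only every `step` pixels.
    candidates = [(score(x, y), x, y)
                  for y in range(0, height, step)
                  for x in range(0, width, step)]
    _, bx, by = max(candidates)
    # Phase 2: hill climbing - move to a strictly better neighbor until none exists.
    while True:
        cur = score(bx, by)
        improved = False
        for x in (bx - 1, bx, bx + 1):
            for y in (by - 1, by, by + 1):
                if 0 <= x < width and 0 <= y < height and score(x, y) > cur:
                    cur, bx, by, improved = score(x, y), x, y, True
        if not improved:
            return bx, by
```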
…simplest forms for modeling and identification purposes, and so forth.

Advanced Morphology Transformations

Concepts

The advanced morphology functions are conditional combinations of fundamental transformations, such as binary erosion and dilation. The functions apply to binary images in which particles have been set to 1 by thresholding and the background is equal to 0. This section describes the following advanced binary morphology functions:

- Border
- Hole filling
- Labeling
- Lowpass filters
- Highpass filters
- Separation
- Skeleton
- Segmentation
- Distance
- Danielsson
- Circle
- Convex hull

Note: In this section of the manual, the term pixel denotes a pixel equal to 1, and the term particle denotes a group of pixels equal to 1.

Border Function

The border function removes particles that touch the border of the image. These particles may have been truncated during the digitization of the image, and eliminating them helps to avoid erroneous particle measurements and statistics.

Hole Filling Function

The hole filling function fills the holes within particles.

Labeling Function

The labeling function assigns a different gray-level value to each particle. The image produced is not a binary image but a labeled image that uses a number of gray-level values equal to the number of particles in the image, plus the gray level 0 used for the background…
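The labeling function described above is essentially connected-component labeling. A minimal 4-connected sketch (NI's implementation is not published; this illustrates the idea):

```python
def label(binary):
    """Minimal 4-connected labeling: background pixels stay 0 and each
    particle receives a distinct label 1, 2, ... via flood fill."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                current += 1
                stack = [(r, c)]           # flood-fill one particle
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and binary[y][x] and not labels[y][x]:
                        labels[y][x] = current
                        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    return labels, current
```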
An offset defines the coordinate position in the original image where you want to place the origin of the image mask. Figure 1-5 illustrates the different methods of applying image masks. Figure 1-5a shows the ROI in which you want to apply an image mask. Figure 1-5b shows an image mask with the same size as the inspection image; in this case, the offset is set to (0, 0). A mask image also can be the size of the bounding rectangle of the ROI, as shown in Figure 1-5c, where the offset specifies the location of the mask image in the reference image. You can define this offset to apply the mask image to different regions in the inspection image.

Figure 1-5. Using an Offset to Limit an Image Mask (1: Region of Interest; 2: Image Mask)

Figure 1-6 illustrates the use of a mask with two different offsets. Figure 1-6a shows the inspection image, and Figure 1-6b shows the image mask. Figures 1-6c and 1-6d show the results of a function using the image mask, given offsets of (0, 0) and (3, 1), respectively.

Figure 1-6. Effect of Applying a Mask with Different Offsets (legend: border pixels; pixels not affected by the mask; pixels affected by the mask)

For more information about…
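The offset mechanics can be sketched by computing which image pixels a shifted mask covers. This is an illustrative helper, not an IMAQ Vision call:

```python
def masked_pixels(image_shape, mask, offset=(0, 0)):
    """Return the set of (x, y) image pixels a function would affect:
    pixels covered by a nonzero mask value after shifting the mask's
    origin to `offset`. Pixels falling outside the image are ignored."""
    width, height = image_shape
    ox, oy = offset
    affected = set()
    for my, row in enumerate(mask):
        for mx, value in enumerate(row):
            x, y = mx + ox, my + oy
            if value and 0 <= x < width and 0 <= y < height:
                affected.add((x, y))
    return affected
```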
…16-bit and floating-point images: dynamicRange = dynamicMax - dynamicMin. f(x) represents the new value of a pixel. The function scales f(x) so that f(rangeMin) = dynamicMin and f(rangeMax) = dynamicMax; on [rangeMin, rangeMax], f(x) behaves according to the method you select.

In the case of an 8-bit resolution, a LUT is a table of 256 elements. The index of each element of the array represents an input gray-level value, and the value of each element indicates the output value. The transfer function associated with a LUT has an intended effect on the brightness and contrast of the image.

Example

The following example uses the source image below. In the linear histogram of the source image, the gray-level intervals [0, 49] and [191, 254] do not contain significant information.

Using the following LUT transformation, any pixel with a value less than 49 is set to 0, and any pixel with a value greater than 191 is set to 255. The interval [50, 190] expands to [1, 254], increasing the intensity dynamic of the regions with a concentration of pixels in the gray-level range [50, 190]:

F(x) = 0 if x is in [0, 49]
F(x) = 255 if x is in [191, 254]
F(x) = 1.81 * x - 89.5 otherwise

The LUT transformation produces the following image. T…
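The example transfer function can be realized as an actual 256-entry table and applied by indexing, which is exactly how a LUT operation works:

```python
def build_lut():
    """Build the 256-entry LUT from the example: crush [0, 49] to 0,
    saturate [191, 255] to 255, stretch [50, 190] linearly onto [1, 254]."""
    lut = []
    for x in range(256):
        if x <= 49:
            lut.append(0)
        elif x >= 191:
            lut.append(255)
        else:
            lut.append(round(1.81 * x - 89.5))
    return lut

def apply_lut(pixels, lut):
    # Applying a LUT is a pure table lookup per pixel.
    return [lut[p] for p in pixels]
```

Note that F(50) = 1 and F(190) = 254, so the linear segment meets the clipped segments consistently.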
…without locating the object in every inspection image. In most cases, the object under inspection is not positioned in the camera's field of view consistently enough to use fixed search areas. If the object is shifted or rotated within an image, the search areas should shift and rotate with the object. The search areas are defined relative to a coordinate system. A coordinate system is defined by a reference point (origin) and a reference angle in the image, or by the lines that make up its two axes.

When to Use

Use coordinate systems in a gauging application when the object does not appear in the same position in every inspection image. You also can use a coordinate system to define search areas on the object relative to the location of the object in the image.

Concepts

All measurements are defined with respect to a coordinate system. A coordinate system is based on a characteristic feature of the object under inspection, which is used as a reference for the measurements. When you inspect an object, first locate the reference feature in the inspection image. Choose a feature on the object that the software can reliably detect in every image. Do not choose a feature that may be affected by manufacturing errors that would make the feature impossible to locate in images of defective parts. You can restrict the reg…
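Making a search area "shift and rotate with the object" is a rigid 2D transform: a point defined relative to the coordinate system is rotated by the reference angle and translated by the origin. A minimal sketch (axis direction and angle-sign conventions are simplified relative to IMAQ Vision's indirect/direct modes):

```python
from math import cos, sin, radians

def to_image_coords(point, origin, angle_deg):
    """Map a point defined in a coordinate system (origin plus a
    counterclockwise rotation angle) into image coordinates."""
    px, py = point
    ox, oy = origin
    a = radians(angle_deg)
    return (ox + px * cos(a) - py * sin(a),
            oy + px * sin(a) + py * cos(a))
```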
…matching tool gives accurate results.

Pattern Orientation and Multiple Instances

A pattern matching tool can locate the reference pattern in an image even if the pattern in the image is rotated or scaled. When a pattern is rotated or scaled in the image, the pattern matching tool can detect the following items in the image:

- The pattern in the image
- The position of the pattern in the image
- The orientation of the pattern
- Multiple instances of the pattern in the image, if applicable

Figure 12-2a shows a template image, or pattern. Figure 12-2b shows the template shifted in the image, Figure 12-2c shows the template rotated in the image, and Figure 12-2d shows the template scaled in the image. Figures 12-2b to 12-2d illustrate multiple instances of the template.

Figure 12-2. Pattern Orientation and Multiple Instances

Ambient Lighting Conditions

The pattern matching tool can find the reference pattern in an image under conditions of uniform changes in the lighting across the image. Figure 12-3 illustrates the typical conditions under which pattern matching works correctly. Figure 12-3a shows the original template image, Figure 12-3b shows the same pattern under bright light, and Figure 12-3c shows the pattern under poor lighting.

Figure 12-3. Examples of Lighting Conditions

Blur and Noise Conditions…
…system of three primary colors, XYZ, in which all visible colors can be represented using a weighted sum of only positive values of X, Y, and Z. Figure 1-9 shows the functions used to define the weights of the X, Y, and Z components. The Y component gives the luminance information, and the X and Z components give color information.

Figure 1-9. CIE Color Matching Functions

CIE L*a*b* Color Space

CIE 1976 L*a*b*, one of the CIE-based color spaces, is a way to linearize the perceptibility of color differences. The nonlinear relations for L*, a*, and b* mimic the logarithmic response of the eye.

CMY Color Space

CMY is another set of familiar primary colors: cyan, magenta, and yellow. CMY is a subtractive color space, in which these primary colors are subtracted from white light to produce the desired color. The CMY color space is the basis of most color printing and photography processes. CMY is the complement of the RGB color space, because cyan, magenta, and yellow are the complements of red, green, and blue.

YIQ Color Space

The YIQ space is the primary color space adopted by the National Television System Committee (NTSC) for color TV broadcasting. It is a linear transformation of the RGB cube for transmission efficiency and for maintaining compatibility with monochrome television standards. The Y component of the YIQ system provides…
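Two of the relationships above are simple enough to state in code: CMY as the complement of RGB, and the YIQ luminance component Y, which the Python standard library computes with the NTSC coefficients:

```python
import colorsys

def rgb_to_cmy(r, g, b):
    """CMY is the complement of 8-bit RGB: each primary subtracted from white."""
    return 255 - r, 255 - g, 255 - b

def luminance_yiq(r, g, b):
    """Y (luminance) of the YIQ transform, via the standard library."""
    y, _, _ = colorsys.rgb_to_yiq(r / 255.0, g / 255.0, b / 255.0)
    return y
```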
…filter or smooth the pixel intensities of an image. Applications include noise filtering, uneven background correction, and gray-level feature extraction.

Grayscale Morphology Concepts

The gray-level morphology functions apply to gray-level images. You can use these functions to alter the shape of regions by expanding bright areas at the expense of dark areas, and vice versa. These functions smooth gradually varying patterns and increase the contrast in boundary areas. This section describes the following gray-level morphology functions:

- Erosion
- Dilation
- Opening
- Closing
- Proper opening
- Proper closing
- Auto-median

These functions are derived from combinations of gray-level erosions and dilations that use a structuring element.

Erosion Function

A gray-level erosion reduces the brightness of pixels that are surrounded by neighbors with a lower intensity. The neighborhood is defined by a structuring element.

Dilation Function

A gray-level dilation increases the brightness of each pixel that is surrounded by neighbors with a higher intensity. The neighborhood is defined by a structuring element. The gray-level dilation has the opposite effect of the gray-level erosion, because dilating bright regions also erodes dark regions.

Erosion and Dilation Examples

This example uses the following source image…
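Gray-level erosion and dilation reduce to neighborhood minimum and maximum operations. A sketch over a full 3 x 3 neighborhood (a real implementation would honor an arbitrary structuring element and handle borders):

```python
def gray_morph(image, op):
    """Gray-level erosion/dilation sketch over a full 3x3 neighborhood:
    erosion takes the neighborhood minimum, dilation the maximum."""
    pick = min if op == "erode" else max
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]       # border pixels left unchanged
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r][c] = pick(image[y][x]
                             for y in (r - 1, r, r + 1)
                             for x in (c - 1, c, c + 1))
    return out
```

A bright pixel surrounded by darker neighbors is dimmed by erosion and preserved by dilation, matching the definitions above.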
…filters

A nonlinear filter replaces each pixel value with a nonlinear function of its surrounding pixels. Like the linear filters, the nonlinear filters operate on a neighborhood.

Nonlinear Prewitt Filter

The nonlinear Prewitt filter is a highpass filter that extracts the outer contours of objects. It highlights significant variations of the light intensity along the vertical and horizontal axes. Each pixel is assigned the maximum value of its horizontal and vertical gradients, obtained with the following Prewitt convolution kernels:

Prewitt #0:
-1  0  1
-1  0  1
-1  0  1

Prewitt #12:
-1 -1 -1
 0  0  0
 1  1  1

Nonlinear Sobel Filter

The nonlinear Sobel filter is a highpass filter that extracts the outer contours of objects. It highlights significant variations of the light intensity along the vertical and horizontal axes. Each pixel is assigned the maximum value of its horizontal and vertical gradients, obtained with the following Sobel convolution kernels:

Sobel #16:
-1  0  1
-2  0  2
-1  0  1

Sobel #28:
-1 -2 -1
 0  0  0
 1  2  1

As opposed to the Prewitt filter, the Sobel filter assigns a higher weight to the horizontal and vertical neighbors of the central pixel.

Nonlinear Prewitt and Nonlinear Sobel Example

This example uses the following source image. Both filters outline the contours of the objects. Because of the differ…
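The "maximum of the horizontal and vertical gradients" rule can be sketched directly with the Sobel kernels above (interior pixels only; borders are left at zero in this sketch):

```python
def nonlinear_sobel(image):
    """Nonlinear Sobel sketch: each interior pixel becomes the maximum of
    the absolute horizontal and vertical Sobel gradients."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # responds to vertical edges
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # responds to horizontal edges
    rows, cols = len(image), len(image[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = gy = 0
            for i in range(3):
                for j in range(3):
                    p = image[r + i - 1][c + j - 1]
                    gx += kx[i][j] * p
                    gy += ky[i][j] * p
            out[r][c] = max(abs(gx), abs(gy))
    return out
```

Swapping in the Prewitt kernels gives the nonlinear Prewitt filter with no other change.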
…that controls a specific hardware device, such as an IMAQ or DAQ device.

dynamic range: The ratio of the largest signal level a circuit can handle to the smallest signal level it can handle (usually taken to be the noise level), normally expressed in decibels.

edge: Defined by a sharp change (transition) in the pixel intensities in an image or along an array of pixels.

edge contrast: The difference between the average pixel intensity before the edge and the average pixel intensity after the edge.

edge detection: Any of several techniques used to identify the edges of objects in an image.

edge hysteresis: The difference in threshold level between a rising and a falling edge.

edge steepness: The number of pixels that corresponds to the slope, or transition area, of an edge.

energy center: The center of mass of a grayscale image. See center of mass.

entropy: A measure of the randomness in an image. An image with high entropy contains more pixel-value variation than an image with low entropy.

equalization: See histogram equalization.

erosion: Reduces the size of an object along its boundary and eliminates isolated points in the image.

Euclidean distance: The shortest distance between two points in a Cartesian system.

exponential and gamma corrections: Expand the high gray-level information in an image while suppressing low gray-level information.

Further glossary terms, whose definitions fall outside this excerpt: exponential function, FFT, fiducial, form, Fourier spectrum, Fourier transform, frequency filters, ft, function, gamma, gauging.
…the internal representation of images in IMAQ Vision, image borders, image masks, and color spaces. Chapter 2, Display, contains information about image display, palettes, regions of interest, and nondestructive overlays. Chapter 3, System Setup and Calibration, describes how to set up an imaging system and calibrate the imaging setup so that you can convert pixel coordinates to real-world coordinates.

Digital Images

This chapter contains information about the properties of digital images, image types and file formats, the internal representation of images in IMAQ Vision, image borders, image masks, and color spaces.

Definition of a Digital Image

An image is a two-dimensional array of values representing light intensity. For the purposes of image processing, the term image refers to a digital image. An image is a function of the light intensity f(x, y), where f is the brightness of the point (x, y), and x and y represent the spatial coordinates of a picture element, abbreviated pixel. By convention, the spatial reference of the pixel with the coordinates (0, 0) is located at the top, left corner of the image. Notice in Figure 1-1 that the value of x increases moving from left to right, and the value of y increases from top to bottom.

Figure 1-1. Spatial Reference of the (0, 0) Pixel

In digital image processing…
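The (0, 0)-top-left convention maps naturally onto a row-major pixel buffer, which is the usual in-memory layout (how IMAQ Vision stores images internally is described elsewhere in this chapter; this is a generic illustration):

```python
def pixel_value(buffer, width, x, y):
    """Read f(x, y) from a row-major pixel buffer where (0, 0) is the
    top-left corner: x grows rightward, y grows downward."""
    return buffer[y * width + x]
```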
…the smoothing filter, but using a Gaussian kernel in the filter operation. The blurring in a Gaussian filter is more gentle than in a smoothing filter.

gradient: See gradient filter.

gradient filter: Extracts the contours (edge detection) in gray-level values. Gradient filters include the Prewitt and Sobel filters.

gray level: The brightness of a pixel in an image.

gray-level dilation: Increases the brightness of pixels in an image that are surrounded by other pixels with a higher intensity.

gray-level erosion: Reduces the brightness of pixels in an image that are surrounded by other pixels with a lower intensity.

grayscale image: An image with monochrome information.

grayscale morphology: Functions that perform morphological operations on a gray-level image.

highpass attenuation: Inverse of lowpass attenuation. Removes or attenuates low frequencies present in the FFT domain of an image.

highpass filter: Emphasizes the intensity variations in an image, detects edges or object boundaries, and enhances fine details in an image.

highpass frequency filter: Attenuates or removes (truncates) low frequencies present in the frequency domain of the image. A highpass frequency filter suppresses information related to slow variations of light intensities in the spatial image.

highpass truncation: Inverse of lowpass truncation.

histogram: Indicates the quantitative distribution of the pixels of an image per gray-level value.

Further glossary terms, whose definitions fall outside this excerpt: histogram equalization, histogram inversion, histograph, hit-miss function, hole filling function, HSI, HSL, HSV, hue, I/O, image, image border, Image Browser.
Exponential and Gamma Correction Examples

The following series of illustrations presents the linear and cumulative histograms of an image after various LUT transformations. The more the histogram is compressed, the darker the image.

Note: Graphics on the left represent the original image, graphics on the top right represent the linear histogram, and graphics on the bottom right represent the cumulative histogram.

The following graphic shows the original image and histograms. A Power Y transformation, where Y = 1.5, produces the following image and histograms. A Square, or Power Y, transformation, where Y = 2, produces the following image and histograms. An Exponential transformation produces the following image and histograms.

Equalize

The Equalize function is a lookup-table operation that does not work on a predefined LUT. Instead, the LUT is computed based on the content of the image to which the function is applied. The Equalize function alters the gray-level values of pixels so that they become evenly distributed in the defined grayscale range (0 to 255 for an 8-bit image). The function associates an equal number of pixels per constant gray-level interval and takes full advantage of the available shades of gray. Use this transformation to incre…
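The standard way to compute such an image-dependent LUT is from the cumulative histogram. A minimal sketch of histogram equalization for 8-bit values (NI's exact rounding rules may differ):

```python
def equalize(pixels, levels=256):
    """Histogram-equalization sketch: build a LUT from the cumulative
    histogram so gray levels spread over 0..levels-1."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    lut, cumulative = [], 0
    for count in hist:
        cumulative += count
        # Map each input level to its cumulative fraction of the full range.
        lut.append(round((levels - 1) * cumulative / total))
    return [lut[p] for p in pixels]
```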
…the fitted line is a line fit to the strongest subset of the points after ignoring the outlying points, as shown in Figure 13-12.

Figure 13-10. Data Set and Fitted Line Using Two Methods (1: Edge Points; 2: Standard Line Fit; 3: IMAQ Vision Fit Line)

Figure 13-11. Calculation of the Mean Square Distance (MSD) (1: Perpendicular Distance from an Edge Point to the Line; 2: Line Fit; 3: Points Used to Fit the Line)

The pixel radius, minimum score, and maximum iterations parameters control the behavior of the line fit function. The pixel radius defines the maximum distance allowed, in pixels, between a valid point and the estimated line. The algorithm estimates a line for which at least half the points in the set are within the pixel radius. If a set of points does not have such a line, the function attempts to return the line that has the greatest number of valid points.

Figure 13-12. Strongest Line Fit (1: Strongest Line Returned by the Line Fit Function; 2: Alternate Line Discarded by the Line Fit Function)

Increasing the pixel radius increases the distance allowed between a point and the estimated line. Typically, you can use the imaging system resolution and the amount of noise in your…
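A simplified version of this behavior is an iterative inlier refit: fit all points, discard those farther than the pixel radius from the line, and refit until stable. This is a sketch of the idea, not NI's algorithm, and it assumes a non-vertical line (y = m*x + b):

```python
def fit_line(points, pixel_radius, max_iterations=10):
    """Robust line-fit sketch: least-squares fit, then refit using only
    points within `pixel_radius` (perpendicular distance) of the current
    line, until the fit stops changing. Returns (slope, intercept)."""
    def lsq(pts):
        n = len(pts)
        sx = sum(x for x, _ in pts)
        sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts)
        sxy = sum(x * y for x, y in pts)
        m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        return m, (sy - m * sx) / n

    m, b = lsq(points)
    for _ in range(max_iterations):
        inliers = [(x, y) for x, y in points
                   if abs(m * x + b - y) / (m * m + 1) ** 0.5 <= pixel_radius]
        if len(inliers) < 2:          # guard: keep the last estimate
            break
        new = lsq(inliers)
        if new == (m, b):
            break
        m, b = new
    return m, b
```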
separation function: Separates particles that touch each other by narrow isthmuses.

shape matching: Finds objects in an image whose shape matches the shape of the object specified by a shape template. The matching process is invariant to rotation and can be set to be invariant to the scale of the objects.

shift-invariant matching: A pattern matching technique in which the reference pattern can be located anywhere in the test image, but cannot be rotated or scaled.

…: A highpass filter that outlines edges.

skeleton function: Applies a succession of thinning operations to an object until its width becomes one pixel.

skiz function: Obtains lines in an image that separate each object from the others and are equidistant from the objects that they separate.

smoothing filter: Blurs an image by attenuating variations of light intensity in the neighborhood of a pixel.

Sobel filter: Extracts the contours (edge detection) in gray-level values, using a 3 x 3 filter kernel.

spatial calibration: Assigning physical dimensions to the area of a pixel in an image.

spatial filters: Alter the intensity of a pixel with respect to variations in intensities of its neighboring pixels. You can use these filters for edge detection, image enhancement, noise reduction, smoothing, and so forth.

spatial resolution: The number of pixels in an image, in terms of the number of rows and columns in the image.

Further glossary terms, whose definitions fall outside this excerpt: square function, square root function, standard representation, statute mile, structuring element, subpixel analysis, template, thickening, thinning, threshold, threshold interval, TIFF, Tools palette, trut…
…structuring element and a hexagonal frame mode, the transformation does not consider the elements (2, 0) and (2, 2) when calculating the effect of the neighbors on the pixel being processed. If a morphological function uses a 5 x 5 structuring element and a hexagonal frame mode, the transformation does not consider the elements (0, 0), (4, 0), (4, 1), (4, 3), (0, 4), and (4, 4).

Figure 9-4 illustrates a morphological transformation using a 3 x 3 structuring element and a rectangular frame mode. The structuring element

0 1 0
1 1 1
0 1 0

is centered on the pixel P0, whose neighborhood is labeled

P1 P2 P3
P4 P0 P5
P6 P7 P8

so the new value of P0 is a function of the pixels selected by the structuring element's 1 coefficients: T(P0) = f(P0, P2, P4, P5, P7).

Figure 9-4. Transformation Using a 3 x 3 Structuring Element and Rectangular Frame

Figure 9-5 illustrates a morphological transformation using a 3 x 3 structuring element and a hexagonal frame mode; the new value of P0 is again a function of the neighbors selected by the structuring element, with the neighborhood arranged on the hexagonal pixel frame.

Figure 9-5. Transformation Using a 3 x 3 Structuring Element and Hexagonal Frame

Table 9-1 illustrates the effect of the pixel frame shape on a neighborhood, given three structuring element sizes (3 x 3, 5 x 5, and 7 x 7). The gray boxes indicate the neighbors of each black center pixel.

Table 9-1. Pixel Neighborhoods Based on Pixel Frame Shapes
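The neighbor-selection rule for the rectangular frame mode can be expressed compactly: only positions where the structuring element holds a 1 contribute to the new value of P0. A sketch for any odd-sized centered element:

```python
def neighbors(image, r, c, structuring_element):
    """Collect the neighborhood values selected by the 1-coefficients of a
    centered, odd-sized structuring element (rectangular frame mode)."""
    k = len(structuring_element) // 2
    return [image[r + i - k][c + j - k]
            for i, row in enumerate(structuring_element)
            for j, coeff in enumerate(row) if coeff]
```

With the cross-shaped 3 x 3 element from Figure 9-4, this returns exactly P2, P4, P0, P5, and P7; a hexagonal frame mode would additionally drop the excluded element positions listed above.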
…and noise. The histogram is a fundamental image analysis tool that describes the distribution of the pixel intensities in an image. Use the histogram to determine if the overall intensity in the image is high enough for your inspection task. You can use the histogram to determine whether an image contains distinct regions of certain grayscale values. You also can use a histogram to tune the image acquisition conditions.

You can detect two important criteria by looking at the histogram:

- Saturation: Too little light in the imaging environment leads to underexposure of the imaging sensor, while too much light causes overexposure, or saturation, of the imaging sensor. Images acquired under underexposed or saturated conditions do not contain all the information that you want to inspect from the scene being observed. It is important to detect these imaging conditions and correct for them during setup of your imaging system. You can detect whether a sensor is underexposed or saturated by looking at the histogram. An underexposed image contains a large number of pixels with low gray-level values, which appears as a peak at the lower end of the histogram. An overexposed, or saturated, image contains a large number of pixels with very high gray-level values, which appears as a peak at the upper end of the histogram…
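Detecting those end-of-histogram peaks can be automated. The tail width and fraction thresholds below are illustrative choices, not values from the manual:

```python
def exposure_check(pixels, tail=10, fraction=0.25):
    """Histogram-based exposure sketch for 8-bit pixels: flag
    underexposure/saturation when more than `fraction` of the pixels
    pile up in the bottom or top `tail` gray levels."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    low = sum(hist[:tail]) / n      # mass at the dark end
    high = sum(hist[-tail:]) / n    # mass at the bright end
    if low > fraction:
        return "underexposed"
    if high > fraction:
        return "saturated"
    return "ok"
```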
…ts detected in the image
• The area of a polygon specified by its vertex points
• The line that fits a set of points and the equation of that line
• The circle that fits a set of points and its area, perimeter, and radius
• The ellipse that fits a set of points and its area, perimeter, and the lengths of its major and minor axes
• The intersection point of two lines specified by their start and end points
• The line bisecting the angle formed by two lines
• The line midway between a point and a line that is parallel to the line
• The perpendicular line from a point to a line, which computes the perpendicular distance between the point and the line

Line Fitting
The line fitting function in IMAQ Vision uses a robust algorithm to find the line that best fits a set of points. The line fitting function works specifically with the feature points obtained during gauging applications. In a typical gauging application, a rake or a concentric rake function finds a set of points that lie along a straight edge of the object. In an ideal case, all the detected points would form a straight line. However, the points usually do not appear in a straight line for one of the following reasons:

• The edge of the object does not occupy the entire search region used by the rake.
• The edge of the object is not a continuous straight line.
• Noise in the image causes points along the edge to shift from their true positions.

Figure 13-10 shows
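The manual does not spell out the robust algorithm it uses, but the basic idea of fitting a line to rake-detected edge points can be sketched with a plain least-squares fit. The function name and the sample points below are illustrative assumptions.

```python
import numpy as np

def fit_line(xs, ys):
    """Least-squares fit of y = m*x + c to a set of edge points.
    A production-quality fitter would also downweight outliers, as the
    robust algorithm described in the text does."""
    m, c = np.polyfit(xs, ys, 1)
    return m, c

# Noisy points lying roughly along y = 2x + 1, as a rake might return.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([1.0, 3.1, 4.9, 7.0, 9.0])
m, c = fit_line(xs, ys)
```

The fitted slope and intercept land close to the underlying edge even though individual points are shifted by noise.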
…ty in the neighborhood of a pixel. It smooths the overall shape of objects and attenuates details. It is similar to a smoothing filter, but its blurring effect is more subdued.

Given a source image, a Gaussian filter produces a smoothed version of that image.

Kernel Definition
A Gaussian convolution filter is an averaging filter, and its kernel uses the model

a b c
d x d
c b a

where a, b, c, and d are integers and x > 1. The coefficients of a Gaussian convolution kernel of a given size are the best possible approximation, using integer numbers, of a Gaussian curve. For example:

3×3:
1 2 1
2 4 2
1 2 1

5×5:
1 2  4 2 1
2 4  8 4 2
4 8 16 8 4
2 4  8 4 2
1 2  4 2 1

Because all the coefficients in a Gaussian kernel are positive, each pixel becomes a weighted average of its neighbors. The stronger the weight of a neighboring pixel, the more influence it has on the new value of the central pixel. Unlike a smoothing kernel, the central coefficient of a Gaussian filter is greater than 1. Therefore, the original value of a pixel is multiplied by a weight greater than the weight of any of its neighbors. As a result, a greater central coefficient corresponds to a more subtle smoothing effect. A larger kernel size corresponds to a stronger smoothing effect.

Nonlinear Filters
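The weighted-average behavior of the 3×3 kernel above can be sketched as follows. This is an illustrative implementation, not IMAQ Vision's: borders are simply left untouched here (the manual's filters mirror the border instead), and the kernel sum (16) serves as the normalization factor.

```python
import numpy as np

# The 3x3 integer Gaussian kernel from the text; its sum (16) normalizes
# the weighted average so a uniform image stays unchanged.
KERNEL = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]])

def gaussian_filter_3x3(image):
    """Convolve an 8-bit grayscale image with the 3x3 Gaussian kernel.
    Each interior pixel becomes a weighted average of its 3x3 neighborhood,
    with the central pixel carrying the largest weight (4 of 16)."""
    img = image.astype(np.int32)
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            region = img[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = (region * KERNEL).sum() // KERNEL.sum()
    return out.astype(np.uint8)
```

A single bright pixel spreads into its neighborhood in proportion to the kernel weights, which is the subdued blurring the text describes.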
…uces significant differences between the pixel values in the border and the image pixels along the border, which causes the function to detect erroneous edges along the border of the image. If you are using an edge detection function, copy or mirror the pixel values along the border into the border region to obtain more accurate results.

In IMAQ Vision, most image processing functions that use neighborhoods automatically set the pixel values in the image border. The grayscale filtering operations — lowpass, Nth order, and edge detection — use the mirroring method to set pixels in the image border. The binary morphology, grayscale morphology, and segmentation functions copy the pixel values along the border into the border region. The correlate, circles, reject border, remove particles, skeleton, and label functions set the pixel values in the border to zero.

Note: The border of an image is taken into account only for processing. The border is never displayed or stored in a file.

Image Masks

When to Use
An image mask isolates parts of an image for processing. If a function has an image mask parameter, the function's processing or analysis depends on both the source image and the image mask.

Concepts
An image mask is an 8-bit binary image that is the same size as or smaller than the inspection image. Pixe…
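The two ideas in this passage — filling a border by mirroring and restricting processing with a mask — can be sketched in a few lines. These are assumptions for illustration: `numpy`'s "reflect" mode is one plausible reading of "mirroring" (it mirrors without repeating the edge pixel), and the mask convention here treats any nonzero mask pixel as selected.

```python
import numpy as np

def pad_mirror(image, border=1):
    """Fill a border region by mirroring the image pixels along the edge,
    as the grayscale filtering operations described above do."""
    return np.pad(image, border, mode="reflect")

def apply_mask(image, mask, fill=0):
    """Keep pixels where the binary mask is nonzero; set the rest to `fill`,
    so later processing depends on both the source image and the mask."""
    return np.where(mask > 0, image, fill)
```

With the border mirrored, a neighborhood filter run over the padded image no longer sees an artificial jump at the original image edge.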
…unction F to the image, followed by the function G to the image.

Operators

This chapter contains information about arithmetic and logic operators, which mask, combine, and compare images.

Introduction
Operators perform basic arithmetic and logical operations on images. Use operators to add, subtract, multiply, and divide an image with other images or constants. You also can perform logical operations — such as AND/NAND, OR/NOR, and XOR/XNOR — and make pixel comparisons between an image and other images or a constant.

When to Use
Common applications of these operators include time-delayed comparisons, identification of the union or intersection between images, correction of image backgrounds to eliminate light drifts, and comparisons between several images and a model. You also can use operators to threshold or mask images and to alter contrast and brightness.

Operator Concepts
An arithmetic or logic operation between images is a pixel-by-pixel transformation. It produces an image in which each pixel derives its value from the values of pixels with the same coordinates in other images. If A is an image with a resolution XY, B is an image with a resolution XY, and Op is the operator, then the image N resulting from the combination of A and B through the operator Op is such that each pixel P of the resulting image N is assigned the value

P_N = P_A Op P_B
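The pixel-by-pixel transformation P_N = P_A Op P_B can be sketched as follows. The dispatch-table design and the clip-to-8-bit behavior for the arithmetic cases are assumptions for illustration, not IMAQ Vision's documented semantics.

```python
import numpy as np

def image_op(a, b, op):
    """Pixel-by-pixel combination N = A Op B of two equal-size 8-bit images.
    Arithmetic results are clipped to the 0..255 range; logic operators act
    on the pixel bit patterns."""
    ops = {
        "add": lambda x, y: np.clip(x.astype(np.int32) + y, 0, 255),
        "subtract": lambda x, y: np.clip(x.astype(np.int32) - y, 0, 255),
        "and": np.bitwise_and,
        "xor": np.bitwise_xor,
    }
    return ops[op](a, b).astype(np.uint8)
```

Because the operation is purely per-pixel, the same helper also covers a constant operand: broadcasting a scalar against the image gives the image-with-constant case described in the Introduction.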
…und.

When to Use
Pattern matching algorithms are some of the most important functions in image processing because of their use in a wide variety of applications. You can use pattern matching in the following three general applications:

• Alignment — Determines the position and orientation of a known object by locating fiducials. Use the fiducials as points of reference on the object.
• Gauging — Measures lengths, diameters, angles, and other critical dimensions. If the measurements fall outside set tolerance levels, the component is rejected. Use pattern matching to locate the object you want to gauge.
• Inspection — Detects simple flaws, such as missing parts or unreadable printing.

The pattern matching tools in IMAQ Vision measure the similarity between an idealized representation of a feature, called a model or template, and a feature that may be present in an image. A feature is defined as a specific pattern of pixels in an image.

Pattern matching is the key to many applications. Pattern matching provides your application with information about the number of instances and the locations of the template within an image. For example, you can search an image containing a printed circuit board for one or more alignment marks, called fiducials. The machine vision application uses the marks to align the board for chip placement from a chip-mounting device…
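Measuring the similarity between a template and each candidate location is commonly done with normalized cross-correlation; the manual does not say which score IMAQ Vision uses, so the sketch below is a generic, assumption-laden illustration (exhaustive scan, NCC score in [-1, 1]).

```python
import numpy as np

def ncc_score(region, template):
    """Normalized cross-correlation between a region and a template."""
    r = region - region.mean()
    t = template - template.mean()
    denom = np.sqrt((r * r).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((r * t).sum() / denom)

def match_template(image, template):
    """Slide the template over every offset and return the best-scoring
    (row, col) position and its score; 1.0 means a perfect match."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            s = ncc_score(image[y:y + th, x:x + tw].astype(float),
                          template.astype(float))
            if s > best:
                best, best_pos = s, (y, x)
    return best_pos, best
```

Because the score is normalized, it is insensitive to uniform brightness offsets between the template and the scene, which is one reason correlation-style matchers tolerate lighting drift.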
…uring the scaling process to generate the new image.

When you learn for correction, you have the option of constructing a correction table. The correction table is a lookup table, stored in memory, that contains the real-world location information of all the pixels in the image. The lookup table greatly increases the speed of image correction but requires more memory and increases your learning time. Use this option when you want to correct several images at a time in your vision application.

Correcting images is a time-intensive operation, and you may be able to get the measurements you need without image correction. For example, you can use IMAQ Vision particle analysis functions to compute calibrated measurements directly from an image that contains calibration information but has not been corrected. Also, you can convert pixel coordinates returned by edge detection tools into real-world coordinates. Refer to Chapter 5, Machine Vision, of your IMAQ Vision user manual for more information.

Scaling Mode
The scaling mode defines how to scale a corrected image. Two scaling mode options are available: scale to fit and scale to preserve area. Figure 3-9 illustrates the scaling modes. Figure 3-9a shows the original image. With the scale-to-fit option, the corrected image is scaled to fit in an image the same…
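The memory-for-speed trade of a correction table can be sketched as a precomputed per-pixel map. Everything here is an illustrative assumption — the function names, the table layout, and the simple scale-and-offset mapping — not IMAQ Vision's actual data structures.

```python
import numpy as np

def build_correction_table(h, w, pixel_to_world):
    """Evaluate the pixel-to-real-world mapping once per pixel and store the
    results, so later corrections are table lookups instead of computations.
    The table costs h*w*2 floats of memory, mirroring the trade-off in the
    text: more memory and learning time, faster per-image correction."""
    table = np.empty((h, w, 2))
    for y in range(h):
        for x in range(w):
            table[y, x] = pixel_to_world(x, y)
    return table

# Hypothetical mapping: 0.1 mm per pixel with a (5 mm, 2 mm) origin offset.
scale = lambda x, y: (5.0 + 0.1 * x, 2.0 + 0.1 * y)
table = build_correction_table(4, 6, scale)
```

After learning, converting any pixel to real-world units is a single indexed read, `table[y, x]`, regardless of how expensive the original mapping was.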
…ution of the pixels along an ROI in an image.

intensity range — Defines the range of gray-level values in an object of an image.

intensity threshold — Characterizes an object based on the range of gray-level values in the object. If the intensity range of the object falls within the user-specified range, it is considered an object; otherwise, it is considered part of the background.

interpolation — The technique used to find values in between known values when resampling an image or array of pixels.

IRE — A relative unit of measure named for the Institute of Radio Engineers. 0 IRE corresponds to the blanking level of a video signal, 100 IRE to the white level. Notice that for CCIR/PAL video the black level is equal to the blanking level, or 0 IRE, while for RS-170/NTSC video the black level is at 7.5 IRE.

JPEG — Joint Photographic Experts Group. Image file format for storing 8-bit and color images with lossy compression. Extension JPG.

kernel — Structure that represents a pixel and its relationship to its neighbors. The relationship is specified by weighted coefficients of each neighbor.

labeling — The process by which each object in a binary image is assigned a unique value. This process is useful for identifying the number of objects in the imag…

LabVIEW · Laplacian filter · line gauge · line profile · linear filter · logarithmic and inverse gamma corrections · logarithmic function · logic operators · lossless compression
…ws the following features:

• Original particles in black
• Segments in shades of gray
• Skiz lines

Figure 9-27. Segmentation with Skiz Lines

Distance Function
The distance function assigns to each pixel a gray-level value equal to the shortest distance to the particle border. That distance may be equal to the distance to the outer particle border or to a hole within the particle.

Tip: Use the Danielsson function instead of the distance function when possible.

Danielsson Function
The Danielsson function also creates a distance map but is a more accurate algorithm than the classical distance function. Because the destination image is 8-bit, its pixels cannot have a value greater than 255; any pixels with a distance to the edge greater than 255 are set to 255. For example, a circle with a radius of 300 yields 255 concentric rings whose pixel values range from 1 to 255 from the perimeter of the circle inward. The rest of the circle is filled with a solid disk whose pixel value is 255 and whose radius is 45.

Figure 9-28a shows the source threshold image used in the following example. The image is sequentially processed with a lowpass filter, hole filling, and the Danielsson function. The Danielsson function produces the distance map image shown in Figure 9-28b.

Figure 9-28. Danielsson Function

View this final image with a binary palette. In t…
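A classical distance map of the kind described above can be sketched with a two-pass chamfer scan. This illustration assumes city-block (4-connected) distances and clips to 255 for an 8-bit destination; the exact metric IMAQ Vision's distance and Danielsson functions use is not specified here.

```python
import numpy as np

def distance_map(binary):
    """Two-pass city-block distance transform: each particle (True) pixel
    gets the distance to the nearest background pixel, clipped to 255."""
    h, w = binary.shape
    INF = 10 ** 6
    d = np.where(binary, INF, 0).astype(np.int64)
    # Forward pass: propagate distances from the top-left.
    for y in range(h):
        for x in range(w):
            if d[y, x]:
                up = d[y - 1, x] if y else INF
                left = d[y, x - 1] if x else INF
                d[y, x] = min(d[y, x], up + 1, left + 1)
    # Backward pass: propagate distances from the bottom-right.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            if d[y, x]:
                down = d[y + 1, x] if y < h - 1 else INF
                right = d[y, x + 1] if x < w - 1 else INF
                d[y, x] = min(d[y, x], down + 1, right + 1)
    return np.minimum(d, 255).astype(np.uint8)
```

On a particle, the result is exactly the concentric-ring picture the text describes: value 1 at the border, increasing toward the interior, saturating at 255 for very large particles.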
…xel profile to define the edge strength at that point. To locate an edge point, the software scans the pixel profile pixel by pixel from the beginning to the end. A rising edge is detected at the first point at which the pixel value is greater than a threshold value plus a hysteresis value. Set this threshold value to define the minimum edge strength required for qualifying edges. Use the hysteresis value to declare different edge strengths for the rising and falling edges. When a rising edge is detected, the software looks for a falling edge. A falling edge is detected when the pixel value falls below the specified threshold value. This process is repeated until the end of the pixel profile. The first edge along the profile can be either a rising or a falling edge. Figure 11-8 shows the simple edge model.

The simple edge detection method works very well when there is little noise in the image and when there is a distinct demarcation between the object and the background.

Figure 11-8. Simple Edge Detection (legend: 1 — grayscale profile, 2 — threshold value, 3 — hysteresis, 4 — rising edge location, 5 — falling edge location)

Advanced Edge Detection
To compute the edge strength at a given point along the pixel profile, the software averages pixels before and after the analyzed point. The pixels that are averaged after the point can be a certain pixel distance from the
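The scan described above can be sketched as a small state machine. One simplifying assumption: this version always looks for a rising edge first, whereas the text notes the first edge along a profile can be of either polarity.

```python
def find_edges(profile, threshold, hysteresis):
    """Scan a pixel profile left to right. A rising edge fires when a value
    exceeds threshold + hysteresis; a falling edge fires when a value then
    drops below the threshold, per the simple edge model above."""
    edges = []
    rising_level = threshold + hysteresis
    looking_for_rising = True
    for i, v in enumerate(profile):
        if looking_for_rising and v > rising_level:
            edges.append(("rising", i))
            looking_for_rising = False
        elif not looking_for_rising and v < threshold:
            edges.append(("falling", i))
            looking_for_rising = True
    return edges
```

The gap between the two trigger levels is what gives the detector its noise immunity: a value hovering near the threshold cannot rapidly toggle between rising and falling detections.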
…xels. Always choose an origin location that lies within the calibration grid so that you can convert the location to real-world units.
• Specify the angle as the angle between the new coordinate system and the horizontal direction in the real world. If your imaging system has perspective errors but no lens distortion, this angle can be visualized as shown in Figure 3-11. However, if your images exhibit nonlinear distortion, visualizing the coordinate system in the image is not trivial.

Calibration Algorithms
IMAQ Vision has two algorithms for calibration: perspective and nonlinear. Perspective calibration corrects for perspective errors, and nonlinear calibration corrects for perspective errors and nonlinear lens distortion. Learning for perspective is faster than learning for nonlinear distortion.

The perspective algorithm computes one pixel-to-real-world mapping for the entire image. You can use this mapping to convert the coordinates of any pixel in the image to real-world units. The nonlinear algorithm computes pixel-to-real-world mappings in a rectangular region centered around each dot in the calibration grid, as shown in Figure 3-8. IMAQ Vision estimates the mapping information around each dot based on its neighboring dots. You can convert pixel units to real-world units within the area covered by the grid dots. Because IMAQ Vision computes the mappings around each dot, only the ar…
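The manual does not give the math behind the perspective algorithm; a common way to build one pixel-to-real-world mapping for an entire image is a 3×3 homography estimated from grid-point correspondences, sketched here as an assumption.

```python
import numpy as np

def fit_homography(pixel_pts, world_pts):
    """Least-squares 3x3 homography H such that, in homogeneous coordinates,
    world ~ H @ pixel. Built with the direct linear transform: each point
    pair contributes two rows, and the solution is the null vector of A."""
    A = []
    for (x, y), (X, Y) in zip(pixel_pts, world_pts):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    return vt[-1].reshape(3, 3)

def pixel_to_world(H, x, y):
    """Map a pixel coordinate to real-world units through the homography."""
    X, Y, w = H @ np.array([x, y, 1.0])
    return X / w, Y / w
```

Once H is learned from the calibration grid, any pixel in the image converts to real-world units with one matrix multiply, matching the "one mapping for the entire image" behavior described above.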
…y the particle perimeter: A / P.

• Waddel Disk Diameter — Diameter of the disk with the same area as the particle: 2 √(A / π).

Ellipses

• Equivalent Ellipse Major Axis — Total length of the major axis of the ellipse that has the same area and the same perimeter as the particle. This length is equal to 2a. This definition gives the following set of equations:

  Area = π a b
  Perimeter = π √(2 (a² + b²))

  where a is the semi-major axis and b is the semi-minor axis. For a given area and perimeter, only one solution (a, b) exists.

• Equivalent Ellipse Minor Axis — Total length of the minor axis of the ellipse that has the same area and same perimeter as the particle. This length is equal to 2b.

• Ellipse Ratio — Ratio of the major axis of the equivalent ellipse to its minor axis, which is defined as

  ellipse major axis / ellipse minor axis = a / b

  The more elongated the equivalent ellipse, the higher the ellipse ratio. The closer the equivalent ellipse is to a circle, the closer to 1 the ellipse ratio.

Rectangles

• Equivalent Rect Long Side — Length of the long side R_L of the rectangle that has the same area and the same perimeter as the particle. This definition gives the following set of equations:

  Area = R_L × R_S
  Perimeter = 2 (R_L + R_S)
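The equivalent-rectangle equations above can be solved in closed form: for a given area A and perimeter P, the two sides are the roots of t² − (P/2)·t + A = 0. A small sketch (the function name is illustrative):

```python
import math

def equivalent_rect_sides(area, perimeter):
    """Long and short sides of the rectangle with the given area and
    perimeter, obtained as the roots of t^2 - (P/2)*t + A = 0."""
    half_p = perimeter / 2.0  # R_L + R_S
    disc = half_p * half_p - 4.0 * area
    root = math.sqrt(disc)
    long_side = (half_p + root) / 2.0
    short_side = (half_p - root) / 2.0
    return long_side, short_side
```

For example, a particle with area 12 and perimeter 16 has an equivalent rectangle of 6 × 2, since 6 + 2 = 16/2 and 6 × 2 = 12.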