CAN-EYE USER MANUAL
[Figure: photograph of the calibration set-up, with labels for the front nail, the optical centre alignment, the right corner, the perpendicular ruler, and the left and right lateral rulers.]
Figure 26: Example of an image of the experimental design taken with the hemispherical camera and used for the calibration of the projection function. The horizontal dotted yellow line corresponds to the diameter of the image passing through the optical centre, defined by its coordinates as measured previously. The camera is aligned thanks to the front nail and background line.

This process allows computing the coefficient a that relates the radius (in pixels) in the image to the corresponding viewing direction (in degrees). The maximum field of view of the camera + fish-eye lens system can then be computed by fitting a circle to the image (Figure 27). Once the Projection Function sheet is filled, the user must run CAN-EYE, go to the Calibration menu and select Projection Function. CAN-EYE then asks to select the Excel file, computes the projection function parameters automatically and fills the Results sheet.

[Figure 27: projection function fit for image P1010143.JPG, showing the actual projection function against a modelled polynomial of order 1 (theta = 0.0532*R, RMSE = 1.76 pix) and a modelled polynomial of order 2 (theta = a*R^2 + b*R, with a = 0.0069, b = 0.0466, RMSE = 1.30 pix).]
CAN-EYE USER MANUAL
INRA SCIENCE & IMPACT

CAN-EYE V6.313 USER MANUAL
Date: May 2010 (updated May 2014)
Contributing authors: M. Weiss, F. Baret

CONTENT
1 INTRODUCTION ........ 5
1.1 CAN-EYE specific features ........ 5
1.2 … ........ 6
1.3 Hardware and software requirements ........ 9
1.4 Copyright and limitations of responsibility ........ 9
2 … ........ 9
3 USING CAN-EYE STEP BY STEP ........ 10
3.1 Choosing the image type to be processed ........ 10
Hemispherical images ........ 12
Images acquired with a camera inclined at 57.5° ........ 13
Images acquired with a camera at the vertical position (nadir) ........ 14
3.2 Defining the processing parameters ........ 15
Processing parameters for DHP ........ 15
Processing parameters for images at 57.5° ........ 18
Processing parameters for images acquired at nadir ........ 20
3.3 Selecting pertinent images ........ 21
… ........ 22
… ........ 23
for the image name, effective PAI, true PAI for different LAIsat values (see LAItrue), processing date.

Images acquired at nadir: two sheets are generated:
> columns contents are, in order: directory name (CE_NADIR), file name, average FCOVER value;
> columns contents are, in order: directory name (CE_NADIR), file name, image name, individual FCOVER value for the corresponding image, processing date.

6 CALIBRATION MODULE (DHP only)
Optical systems are not perfect, and at least two main characteristics are required to perform an accurate processing of hemispherical images:
- the coordinates of the optical centre;
- the projection function.
In CAN-EYE, the projection function is assumed to be a polar projection: angular distances (in degrees) in the object region are proportional to radial distances (in pixels) on the image plane. Because in some situations the focal length may be manipulated by acting on the zoom system, the projection function must also be known for each focal length used. A simple method is proposed in the CAN-EYE Calibration menu to characterize these features (Figure 17). It requires filling in an Excel file located in the Calibration Data directory. This file contains 4 sheets, named Start, Optical Centre, Projection Function and Results. Inside each s…
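Under the polar projection assumption described above, converting a pixel position into a viewing direction is a one-line computation once the optical centre and the proportionality coefficient are known. A minimal sketch (not part of CAN-EYE; the optical centre coordinates, coefficient and pixel position below are hypothetical values):

```python
import math

def pixel_to_direction(x, y, xc, yc, a):
    """Polar projection: the zenith angle (degrees) is proportional to the
    radial distance (pixels) from the optical centre (xc, yc)."""
    dx, dy = x - xc, y - yc
    r = math.hypot(dx, dy)                 # radius in pixels
    zenith = a * r                         # a in degrees per pixel
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    return zenith, azimuth

# e.g. with a = 0.0532 deg/pixel, a pixel 500 px right of the centre
z, az = pixel_to_direction(1890, 1158, 1390, 1158, 0.0532)  # z = 26.6 deg
```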
[Figure: screenshot of the Projection Function sheet of the calibration Excel file, with tabs Start, Optical Centre, Projection Function and Results.]
Figure 24: Example of a projection function sheet of the calibration Excel file.

Let us assume that the two images are named Im1 and Im2. Im1 must be the image for which the ruler ticks are the most readable. Then you have to look at the images using image processing software (e.g. Paintshop, Microsoft Photo Editor) to read pixel coordinates in the image:
- Read the optical centre position in cm on the rulers of Im1 and Im2 and fill cells B4, C4 and B5, C5 of the Excel file (Projection Function sheet).
- The quantity Δ can be easily measured by looking at one direction on the lateral ruler: (Xp1, Yp1) (cells C10, D10) on Im1, reading the corresponding value h in cm (cell E10) for distance H. Then, for the distance H + Δ, the same point on Im2 corresponds to a value h' (cell E11) on the lateral ruler. It comes simply that Δ = h' - h.
- On the perpendicular ruler, select t…
1.5 cm thick, from which a 30 × 30 cm square is excavated from the middle of one of the sides. The three sides of this gap are equipped with 30 cm long rulers. The camera is set horizontally, as is the experimental design. The camera is aligned along the main axis of the design using the front nail and background line. Hemispherical photographs are taken at two distances, H and H' = H + Δ, from the centre of the design and along the optical axis. The calibration Excel file sheet is shown in Figure 25. Note that it is required to have run the Optical Centre menu before being able to compute the projection function.

[Figure 25: screenshot of the Projection Function sheet of the calibration Excel file. It warns that the first image name MUST be the one for which all the readings on the right, left and perpendicular rulers are made, asks for the value in cm corresponding to the X, Y coordinates of the optical centre on the two acquired images (here P1010143.JPG and P1010141.JPG, optical centre at 1392, 1158), then for the readings on each ruler: the X coordinates corresponding to given values in cm for one of the two images.]
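The geometry of this set-up gives a direct way to recover the proportionality coefficient: a tick at lateral offset x (cm) on the perpendicular ruler is seen from distance H at zenith angle atan(x/H), and its radial pixel distance from the optical centre is read on the image. A sketch of the resulting through-origin least-squares fit (illustrative only: the tick values, pixel radii and the distance H are made up, and H is assumed known here, whereas the two-distance procedure of the manual recovers it through Δ):

```python
import math

H = 50.0                                  # camera-to-design distance (cm), assumed
ticks_cm = [5.0, 10.0, 15.0, 20.0, 25.0]  # hypothetical ruler readings
radii_px = [107, 213, 315, 411, 500]      # hypothetical pixel radii of the ticks

# Zenith angle (degrees) of each tick, then least-squares slope through the
# origin of angle vs radius: a = sum(theta_i * R_i) / sum(R_i^2)
angles = [math.degrees(math.atan(x / H)) for x in ticks_cm]
a = sum(t * r for t, r in zip(angles, radii_px)) / sum(r * r for r in radii_px)
print(f"a = {a:.4f} deg/pixel")
```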
MCR in your environment path:
o Either by opening a command prompt: click on the Start menu, then Execute, and type cmd. When the DOS window opens, type: set PATH=C:\Program Files\MATLAB\MATLAB Component Runtime\v717;%PATH%
o Or select the My Computer icon on your desktop, right-click the icon and select Properties from the menu. Select the Advanced tab and click on Environment Variables; your environment variables are listed. Add C:\Program Files\MATLAB\MATLAB Component Runtime\v717 to the path variable.
> Create a directory that will contain the CAN-EYE executable file and the associated directories (Help, Data, Param_V6).
> Copy CAN_EYE_V6313_yyyy_mm_dd_bits.exe into this directory, then double-click on it: it will install all the CAN-EYE files and associated directories. You should now be ready to launch CAN-EYE by clicking on CAN_EYE_VXXX.exe.

3 USING CAN-EYE STEP BY STEP
3.1 Choosing the image type to be processed
Series of images (either jpg or tiff format) to be processed at once must be stored in the same directory (Figure 2). These images are assumed to correspond to the same ESU (Elementary Sampling Unit) inside a field. All the images must have the same characteristics, i.e. the same format and the same size. Note that, considering the assumptions made in CAN-EYE (Poisson model), it is not correct to estimate the LAI from the gap fraction evaluated on a single image. A minimum…
as the sky in other parts of the image, while the user knows that these pixels belong to leaves. When the user is satisfied with the classification and clicks on DONE, CAN-EYE processes the images to derive the different output variables (LAI, ALA, FAPAR, FCOVER). Figures are displayed on the screen and are saved in the output directory.

4 CAN-EYE OUTPUT DESCRIPTION
The following sections describe CAN-EYE outputs as well as the theoretical background allowing the estimations. Table 2 presents the variables that CAN-EYE derives from the set of digital images:
> PAI57: effective Plant Area Index estimated from P0(57.5°)
> PAIeff: effective plant area index
> ALAeff: effective average plant inclination angle
> PAItrue: true plant area index
> ALAtrue: true average leaf inclination angle
> FCOVER: cover fraction
> FAPAR: instantaneous black-sky fAPAR
> white-sky fAPAR
Table 2: CAN-EYE output variable acronyms that can be derived from the different acquisitions (hemispherical images, images at 57.5° and images acquired at nadir). Click on the link to directly access the paragraph describing the variable computation.

4.1 Definitions and theoretical background
Introduction
Leaf area index: indirect measurement te…
belonging to the other class (sky or soil if the vegetation was first selected; vegetation if the sky or soil was first selected). It is also possible to select two classes: in this case, all the pixels that are not allocated to one or the other class are considered mixed and processed later as such.

At the beginning of this classification process, the user can use different indices to roughly classify the images by thresholding methods (Figure 1, step 4), and can then interactively refine the classification (Figure 1, step 5). Once the allocation of the colours to the defined classes is completed, the images are transformed into binarised images.

Generation of the outputs
This last step is automatic and does not require interaction with the user. The images are binarised using the classification results (Figure 1, step 6). A series of files is produced and stored in a directory located where the initial images were stored. The outputs (Figure 1, step 7) include a report file (html format) where all the elements of the process are described. Additional text files (Excel or ASCII format) are also created, containing the computed gap fractions, LAI, ALA, fAPAR and FCOVER, depending on the type of processed images. Optionally, intermediate results (matlab file format) can also be stored, allowing CAN-EYE to run in batch mode to perform new or additional processing. Note that there is also a possibility to summarize the r…
Figure 11: Main masking window ........ 22
Figure 12: Gamma correction value / masking process ........ 23
Figure 13: Secondary masking window after having selected the image to be masked ........ 23
Figure 14: Applying a mask to all images ........ 24
Figure 15: Classification window ........ 26
Figure 16: CAN-EYE Images at nadir (0°) menu ........ 37
Figure 17: CAN-EYE Calibration menu ........ 38
Figure 18: Example of the Start sheet of the calibration Excel file ........ 38
Figure 19: Image coordinate system ........ 39
Figure 20: Illustration of the holes drilled in the fish-eye cap; the red arrow indicates the … ........ 40
Figure 21: A series of images taken for several positions of the fish-eye cap; in this case, three … ........ 40
Figure 22: Example of an Excel file sheet to be filled to determine the optical centre of a … ........ 41
Figure 23: Example of a CAN-EYE output showing the fitting of the circles to the holes positions in the case of three holes; the actual optical centre is shown by the red cross ........ 42
Figure 24: Example of a projection function sheet of the calibration Excel file ........ 43
for this range is set to 0-10°. The user can change this value when defining the CAN-EYE parameters, which also concern the description of the hemispherical lens properties, at the beginning of the processing.

FAPAR computation
fAPAR is the fraction of photosynthetically active radiation (400-700 nm) absorbed by the vegetation. It varies with sun position. As there is little scattering by leaves in that particular spectral domain, due to the strong absorbing features of the photosynthetic pigments (Andrieu and Baret, 1993), fAPAR is often assumed to be equal to fIPAR (fraction of intercepted photosynthetically active radiation) and therefore to the gap fraction. The actual fAPAR is the sum of two terms, weighted by the diffuse fraction in the PAR domain: the black-sky fAPAR, which corresponds to the direct component (collimated beam irradiance in the sun direction only), and the white-sky fAPAR, the diffuse component. The closest approximation to white-sky fAPAR occurs under a deep cloud cover that may generate an almost isotropic diffuse downward radiation. Following Martonchik et al. (2000), the adjectives black and white are not related to the colour of the sky but rather to the angular distribution of light intensity. Providing the latitude and the date of the image acquisition, the CAN-EYE software proposes three outputs for fAPAR:
1. The instantaneous black-sky fAPAR (fAPAR): the black-sky fAPAR at a given solar position (date, ho…
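The weighted decomposition above can be written directly: the actual fAPAR is the diffuse-fraction-weighted sum of the black-sky and white-sky terms. A minimal sketch (the numerical values below are placeholders, not CAN-EYE outputs):

```python
def actual_fapar(black_sky, white_sky, diffuse_fraction):
    """Actual fAPAR as the sum of the direct (black-sky) and diffuse
    (white-sky) components, weighted by the diffuse fraction in the PAR."""
    return (1.0 - diffuse_fraction) * black_sky + diffuse_fraction * white_sky

# e.g. black-sky fAPAR 0.80, white-sky fAPAR 0.70, 30% diffuse PAR
f = actual_fapar(0.80, 0.70, 0.30)   # 0.77
```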
processing (angular resolution, FAPAR computation). This allows making some pre-computations that are saved to speed up the processing when using the system for another set of images acquired in the same conditions.

Note that CAN-EYE supports only two fish-eye projection types:
> Polar projection function: the angular distances (in degrees) in the object region are proportional to radial distances (in pixels) on the image plane;
> Projection function assuming that the angular distances (in degrees) in the object region are related to radial distances (in pixels) on the image plane by a polynomial function (order 1 or 2).
It is possible within CAN-EYE to determine the camera + fish-eye lens characteristics; for more information, see Section 5.

Description of the processing parameter window for DHP
> User Name: by default, displays the user name environment system variable of the computer.
> Comment: add any useful comment; it will be written in the processing report and output files.
> Image size: automatically filled in by CAN-EYE.
> Optical centre & projection function: these characteristics can either be loaded from a matlab file generated with the CAN-EYE calibration menu (Section 5) or directly provided by the user (choose Create, see hereafter).
> COI is the limit of the image in degrees used during the processing. By default, it is set to 0-60°: zenith angles higher than 60° are not taken into account, due to la…
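The two supported projection types map the pixel radius R to a zenith angle theta either as theta = a*R (order 1) or theta = a*R^2 + b*R (order 2). A sketch of that mapping (the order-1 coefficient echoes the calibration example of this manual; the order-2 coefficients are made up for illustration):

```python
def zenith_from_radius(r_pix, coeffs):
    """Zenith angle (degrees) from the radial distance to the optical centre.
    coeffs = [a] for a first-order (polar) projection: theta = a*R;
    coeffs = [a, b] for a second-order projection: theta = a*R**2 + b*R."""
    if len(coeffs) == 1:
        return coeffs[0] * r_pix
    a, b = coeffs
    return a * r_pix**2 + b * r_pix

theta1 = zenith_from_radius(1000, [0.0532])        # polar: 53.2 deg
theta2 = zenith_from_radius(1000, [1e-5, 0.05])    # order 2 (made-up coeffs): 60.0 deg
```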
the soil. Once the user has chosen one of these two options, he is asked to choose a processing directory that contains the series of images (either jpg or tiff format) to be processed at the same time. Then the user is asked to provide the CAN-EYE processing parameters. Results are stored in a subdirectory called CE_NADIR_Directoryname.

[Figure: screenshot of the CAN-EYE V6.1 main window, with menus Calibration, Hemispherical Images, Images at 57°, Images at nadir (0°), Summary, Help and Exit, and the Help on Images at nadir entry (Downward).]
Figure 5: CAN-EYE Images at nadir (0°) menu.

3.2 Defining the processing parameters
Once the directory containing the images to be processed is selected, the user must provide the processing parameters, including the optics characterization and the CAN-EYE specific parameters, which differ with the image type. Information must be manually entered in a specific window, described hereafter. The processing parameters are then stored in the Param_V6 subdirectory (created where the Can_Eye.exe file is located), with a default proposed name. For hemispherical images, some pre-computations are also performed and saved in order to save time when processing new directories.

Processing parameters for DHP
This window allows defining all the camera + fish-eye lens characteristics (optical centre, projection function) as well as the characteristics required for the
which the useful part is not already extracted (the original image size is kept), issued from an external processing: this allows deriving CAN-EYE variables from images classified with tools other than CAN-EYE. The files must be stored in a zip file called CIE_DirectoryName that includes binary files named ImageName.cie, coded in unsigned 8-bit integers (uint8): gap fraction is between 0 and 100 (0 = vegetation, 100 = gap, in between = mixed pixels); invalid values corresponding to masked areas = 255. For already classified images, the user can either choose a directory that contains CIE_name.zip or CNE_name.zip, or a directory that contains several directories, themselves containing one or several CIE_name.zip (kind of batch process). For each CIE_name.zip, results will be stored in a subdirectory called CE_P180_CIE_name.

[Figure: screenshot of the CAN-EYE V6.1 main window, with menus Calibration, Hemispherical Images, Images at 57°, Images at nadir (0°), Summary, Help and Exit.]
Figure 3: CAN-EYE Hemispherical Images menu.

Images acquired with a camera inclined at 57.5°
Select Images at 57°. This menu allows the processing of images acquired with a camera inclined at 57.5° from the vertical. For this particular direction, the gap fraction is independent of the leaf inclination angle (Weiss et al., 2003). This allows the derivation of L
2 classes: this option allows you to classify pixels belonging to one class only; all the pixels that remain unclassified are assumed to belong to the other class. In this example, the user has selected the soil, which implies that he will have to select all the pixels belonging to that class; the others will be attributed to vegetation. Note that if the user is processing images acquired downwards, the gaps are named "soil", while for upward images the gaps are named "sky".

2 classes + mixed pixels: in that case, the unclassified pixels will be considered as mixed. The user will have to classify:
> all the pixels that he knows for sure belong to the soil;
> all the pixels that he knows for sure belong to the green vegetation;
> all the remaining pixels are considered as mixed and will be processed later as such: the gap fraction is computed as the weighted average between soil (0) and green vegetation (1), where the weights are computed as the distance, in terms of colour, between the pixels and the two classes.

3.6 Classifying the images
Once the classes are defined, the classification window appears on the screen. The classification module is divided into four menus (Figure 15).

[Figure 15: Classification window.]

> Menu 1 (top le…
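The weighted-average rule for mixed pixels can be sketched as follows: each mixed pixel receives a gap value between 0 (vegetation) and 1 (gap: soil or sky) according to its colour distance to the two classes. This is only an illustrative reading of the rule; Euclidean RGB distance to a single class centroid is an assumption of this sketch, not necessarily what CAN-EYE does internally:

```python
import math

def mixed_pixel_gap(pixel_rgb, veg_rgb, gap_rgb):
    """Gap value of a mixed pixel: 0 = vegetation, 1 = gap (soil or sky).
    The weight is the relative colour distance to the two class centroids
    (Euclidean RGB distance, an assumption of this sketch)."""
    d_veg = math.dist(pixel_rgb, veg_rgb)
    d_gap = math.dist(pixel_rgb, gap_rgb)
    if d_veg + d_gap == 0:
        return 0.5
    return d_veg / (d_veg + d_gap)

# A pixel closer to the gap centroid than to the vegetation centroid
# gets a gap value above 0.5.
g = mixed_pixel_gap((120, 120, 120), (60, 120, 60), (150, 150, 150))
```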
… ........ 37
Images acquired at 57.5° ........ 37
Images acquired at nadir ........ 37
6 CALIBRATION MODULE (DHP only) ........ 38
6.1 … ........ 39
6.2 … ........ 39
6.3 Projection function characterization ........ 42
… ........ 47

List of Figures
Figure 1: Overview of the CAN-EYE processing of a series of images ........ 8
Figure 2: Example of the Maize directory containing a series of 9 images (JPG) corresponding to one ESU and to be processed concurrently ........ 12
Figure 3: CAN-EYE Hemispherical Images menu ........ 13
Figure 4: CAN-EYE Images at nadir (0°) menu ........ 14
Figure 5: CAN-EYE Images at nadir (0°) menu ........ 14
Figure 6: Processing parameter window for hemispherical images ........ 17
Figure 7: Creating projection function and optical centre characteristics ........ 17
Figure 8: Processing parameter window for images acquired at 57.5° ........ 18
Figure 9: Processing parameter window for images acquired at nadir (0°) ........ 20
Figure 10: Selecting pertinent images ........ 21
1963) showed that for a view angle of 57.5°, the G function (Eq. 4) can be considered as almost independent of leaf inclination: G ≈ 0.5. Using the contact frequency at this particular 57.5° angle, Warren Wilson (1963) derived the leaf area index independently of the leaf inclination distribution function, within an accuracy of about 7%. Bonhomme et al. (1974) applied this technique using gap fraction measurements and found a very good agreement between the actual and estimated LAI values for young crops. Therefore, for this particular viewing direction, LAI can be easily deduced from the gap fraction:

P0(57.5°) = exp(-0.5·LAI/cos(57.5°))  <=>  LAI = -2·cos(57.5°)·ln(P0(57.5°)) ≈ -1.075·ln(P0(57.5°))      (Eq. 8)

The CAN-EYE software proposes an estimate of the LAI derived from this equation, called LAI57.

Use of multiple directions: LAIeff, ALAeff
Among the several methods described in Weiss et al. (2004), the LAI estimation in the CAN-EYE software is performed by model inversion since, conversely to the use of Miller's formula, it can take into account only a part of the zenith angle range sampled by hemispherical images. This is very useful since there is a possibility to reduce the image field of view to less than 90° zenith. This feature is very important due to the high probability of mixed pixels in the part of the image corresponding to large zenith view angles. LAI and ALA are directly retrieved in CAN-EYE by inverting Eq. 6, assuming an ellipsoidal distribution of the leaf inclin
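The LAI57 estimate is a direct one-liner from the gap fraction measured at 57.5°. A minimal sketch (the gap fraction value used in the example is illustrative):

```python
import math

def lai57(p0_57):
    """LAI from the gap fraction measured at 57.5 degrees:
    P0(57.5) = exp(-0.5*LAI/cos(57.5)) <=> LAI = -2*cos(57.5)*ln(P0(57.5))."""
    return -2.0 * math.cos(math.radians(57.5)) * math.log(p0_57)

lai = lai57(0.25)   # about 1.49
```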
AI only. It is possible to directly process:
> RGB images (jpeg or tiff) acquired with the system:
o upward: camera on the ground looking at the sky;
o downward: camera above the canopy looking at the soil.
> Already classified images (binary), i.e. images that have already been classified. Two image types can be taken into account:
o Classified images from which the useful part has already been extracted, i.e. the lens calibration (FOV, focal length) was taken into account, such as the intermediate results that CAN-EYE provides automatically during a processing, i.e. classified images stored in a zip file called CNE_DirectoryName that includes:
i. A header file (ASCII file CNE_DirectoryName.hdr) with two lines: the first line provides the height of the binarised images, the second line the width of the binarised images.
ii. Binary files named ImageName.cne, coded in unsigned 8-bit integers (uint8): gap fraction is between 0 and 100 (0 = vegetation, 100 = gap, in between = mixed pixels); invalid values corresponding to masked areas = 255.
o Classified images from which the useful part is not already extracted (the original image size is kept), issued from an external processing: this allows deriving CAN-EYE variables from images classified with tools other than CAN-EYE. The files must be stored in a zip file called CIE_DirectoryName that in…
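The CNE intermediate format described above is simple enough to be written and read back in a few lines of standard-library code. A sketch that builds a tiny in-memory CNE archive and recomputes an average gap fraction from it (the directory and image names are made-up examples, and the tiny 2 × 3 image is purely illustrative):

```python
import io
import zipfile

# A tiny 2x3 classified image following the CNE convention:
# 0 = vegetation, 100 = gap, in between = mixed, 255 = masked.
height, width = 2, 3
pixels = bytes([0, 100, 50, 255, 0, 100])      # uint8, row-major

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    # Header: first line = height, second line = width of the binarised image
    zf.writestr("CNE_Maize.hdr", f"{height}\n{width}\n")
    zf.writestr("FDSCN0398_ESU40_SUN.cne", pixels)

# Read the archive back and average the gap fraction over valid pixels
with zipfile.ZipFile(buf) as zf:
    h, w = (int(v) for v in zf.read("CNE_Maize.hdr").split())
    data = zf.read("FDSCN0398_ESU40_SUN.cne")
valid = [v for v in data if v != 255]          # drop masked pixels
gap_fraction = sum(valid) / (100 * len(valid))
print(h, w, round(gap_fraction, 2))
```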
[Table: CCD sensor sizes for different camera models; recoverable entries include the Minolta DIMAGE 7i, Nikon Coolpix 5000, Nikon Coolpix 5700 and Sony DSC-F717, all with a 2/3" CCD (8.8 × 6.6 mm); the other listed models use sensors ranging from 1/3.2" to 1/1.8" CCD.]
Table 1: CCD sensor size for different camera models.

Processing parameters for images acquired at nadir
This window allows defining all the camera characteristics (field of view) as well as the characteristics required for the processing:
> User Name: by default, displays the user name environment system variable of the computer.
> Comment: add any useful comment; it will be written in the processing report and output files.
> Image size: automatically filled in by CAN-EYE.
> CCD sensor width: provide the camera CCD sensor width. This allows the extraction of the useful part of the images (i.e. 0-5°) to compute the cover fraction. The CCD sensor width depends on the camera; Table 1 provides a list of camera models and corresponding sensor sizes (sources: http://homepages.tig.com.au/parsog/photo/sensors1.html and http://www.dpreview.com/news/0210/02100402sensorsizes.asp).
> Sub-Sample Factor: in case you are processing very high resolution images and/or a low field of view camera and/or a hig…
SELECT: selection of one image where a mask must be defined ........ 23
APPLY ALL: apply last mask to all images ........ 24
UNDO: … ........ 24
… ........ 24
DONE: … ........ 24
3.5 … ........ 25
2 classes … ........ 25
… ........ 25
… ........ 25
3.6 Classifying the images ........ 26
4 CAN-EYE OUTPUT DESCRIPTION ........ 27
4.1 Definitions and theoretical background ........ 28
Introduction ........ 28
… ........ 28
Modeling the leaf inclination distribution function g(θl, φl) ........ 29
Estimating leaf area index and leaf inclination from gap fraction measurements ........ 30
… computation ........ 33
fAPAR computation ........ 33
4.2 Description of CAN-EYE output directory content ........ 34
Hemispherical images ........ 34
… ........ 35
… ........ 36
5 SUMMARY MODULE ........ 36
…
PROCESSED IMAGES / MASK: screen copy of the working window after the masking step.
> CLASSIFICATION RESULTS: screen copy of the working window after the classification step.
> AVERAGE BIOPHYSICAL VARIABLES: variables estimated from the series of images.
> GAP FRACTION: plot of P57 (black) and the PAI value as a function of the individual image number.

The Excel file named CE_P57_Report_xxx.xls contains several sheets:
> CAN-EYE P57 parameters: description of the parameters used for the processing (GENERAL PARAMETERS and CALIBRATION PARAMETERS in the html report).
> CAN-EYE P57 Results: table with P57, effective and true PAI values.

Images acquired at Nadir
For images acquired at nadir, results are contained in a subdirectory called CE_NADIR_xxx, where xxx is the name of the directory that contains the images to be processed. The html report is named CE_NADIR_Report_xxx.html. It contains 10 sections:
> GENERAL INFORMATION: CAN-EYE version, user, processing date, processing duration.
> CALIBRATION PARAMETERS: image size, camera model, sub-sampling factor.
> SELECTED IMAGES: table containing the list of the image file names used for the processing and the corresponding gap fraction at nadir estimates.
> NUMBER OF CLASSES: number of classes that was selected (2, or 3 = 2 + mixed; see Section 3.5).
> PROCESSED IMAGES / MASK: screen…
corresponding to one ESU and to be processed concurrently (Figure 2).

Hemispherical images
Select Hemispherical Images. This menu allows the processing of DHP images acquired with a camera + fish-eye lens, and the derivation of LAI, ALA, FAPAR and FCOVER. It is possible to directly process:
> RGB images (jpeg or tiff) acquired with the system:
o upward: camera on the ground looking at the sky;
o downward: camera above the canopy looking at the soil.
Results will be stored in a sub-directory created in the image directory and called CE_P180_imagedirectory.
> Already classified images (binary), i.e. images that have already been classified. Two image types can be taken into account:
o Classified images from which the useful part has already been extracted, i.e. the fish-eye lens calibration (FOV, COI) has been taken into account, such as the intermediate results that CAN-EYE provides automatically during a processing, i.e. classified images stored in a zip file called CNE_DirectoryName that includes:
i. A header file (ASCII file CNE_DirectoryName.hdr) with two lines: the first line provides the height of the binarised images, the second line provides the width of the binarised images.
ii. Binary files named ImageName.cne, coded in unsigned 8-bit integers (uint8): gap fraction is between 0 and 100 (0 = vegetation, 100 = gap, in between = mixed pixels); invalid values corresponding to masked areas = 255.
o Classified images from
…P., 1997. A method for estimating canopy openness, effective leaf area index, and photosynthetically active photon flux density using hemispherical photography and computerized image analysis techniques. BC-X-373, Can. For. Serv., Pac. For. Cent. Inf.
Knyazikhin, Y., Martonchik, J.V., Myneni, R.B., Diner, D.J. and Running, S.W., 1998. Synergistic algorithm for estimating vegetation canopy leaf area index and fraction of absorbed photosynthetically active radiation from MODIS and MISR data. J. Geophys. Res., 103(D24): 32257-32275.
Lang, A.R., 1991. Application of some of Cauchy's theorems to estimation of surface areas of leaves, needles and branches of plants, and light transmittance. Agric. For. Meteorol., 55: 191-212.
Miller, J.B., 1967. A formula for average foliage density. Aust. J. Bot., 15: 141-144.
Nilson, T., 1971. A theoretical analysis of the frequency of gaps in plant stands. Agric. Meteorol., 8: 25-38.
Ross, J., 1981. The radiation regime and architecture of plant stands. The Hague, 391 pp.
Wang, Y.P. and Jarvis, P.G., 1988. Mean leaf angles for the ellipsoidal inclination angle distribution. Agric. For. Meteorol., 43: 319-321.
Warren Wilson, J., 1959. Analysis of the spatial distribution of foliage by two-dimensional point quadrats. New Phytol., 58: 92-101.
Warren Wilson, J., 1960. Inclined point quadrats. New Phytol., 59: 1-8.
Warren Wilson, J., 1963. Estimation of foliage denseness and foliage angle by inclined point quadrats. Aust…
am_V6 of the directory where the Can_Eye.exe file is located. This parameter file can be used later on to process images issued from the same system (camera + fish-eye lens). Note that this allows performing some pre-computations and will speed up future processing.
> The default parameter file name for the specific window shown here as an example is P180_2112_2816_Cent_1063_1390_ProjFuncDeg1_COI60_Sub1_Teta5_Phi20_FCov1_0.mat.

[Figure: screenshot of the CAN-EYE Parameterization window, with sections GENERAL PARAMETERS (user name WEISS, comment), CALIBRATION PARAMETERS (image size: 2112 lines × 2816 columns; Optical Centre & Projection Function: load from matlab file / create; COI: 60; sub-sample factor: 1), ANGULAR RESOLUTION (zenith, azimuth, FCover), FAPAR COMPUTATION (day of year: 134, latitude: 43) and SAVING DATA.]
Figure 6: Processing parameter window for hemispherical images.

Creating Projection Function and Optical Centre Characteristics
When pressing the Create button in the CALIBRATION PARAMETERS subsection, the user must enter the characteristics of the lens used to acquire the images:
> Optical Centre: location of the optical centre along the lines (Y) and rows (X), knowing that the upper left corner of the image has coordina…
ation distribution function. This induces the two normalization conditions given in Eq. 5a and Eq. 5b:

(1/2π) ∫ from 0 to 2π ∫ from 0 to π/2 of g(θl, φl)·sin(θl) dθl dφl = 1        (Eq. 5a)
(1/2π) ∫ from 0 to 2π ∫ from 0 to π/2 of G(θv, φv)·sin(θv) dθv dφv = 1/2      (Eq. 5b)

The contact frequency is a very appealing quantity to indirectly estimate LAI because no assumptions on leaf spatial distribution, shape and size are required. Unfortunately, the contact frequency is very difficult to measure in a representative way within canopies. This is the reason why the gap fraction is generally preferred. In the case of a random spatial distribution of infinitely small leaves, the gap fraction P0(θv, φv) in direction (θv, φv) is related to the contact frequency by Eq. 6:

P0(θv, φv) = exp(-N0(θv, φv)) = exp(-G(θv, φv)·LAI/cos(θv))      (Eq. 6)

This is known as the Poisson model. Conversely to the contact frequency, which is linearly related to LAI, the gap fraction is highly non-linearly related to LAI. Nilson (1971) demonstrated, both from theoretical and empirical evidence, that the gap fraction can generally be expressed as an exponential function of the leaf area index, even when the random turbid medium assumptions associated with the Poisson model are not satisfied. In the case of clumped canopies, a modified expression of the Poisson model can be written:

P0(θv, φv) = exp(-λ0·G(θv, φv)·LAI/cos(θv))      (Eq. 7)

where λ0 is the clumping parameter (λ0 < 1).

Modeling the leaf inclination distribution function g(θl, φl)
As shown p…
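The Poisson model, with and without clumping, can be checked numerically. A minimal sketch (illustrative assumptions: G = 0.5, i.e. a spherical leaf inclination distribution, and made-up LAI, view angle and clumping values):

```python
import math

def gap_fraction(lai, theta_v_deg, g=0.5, clumping=1.0):
    """Poisson model gap fraction with an optional clumping parameter:
    P0 = exp(-lambda0 * G * LAI / cos(theta_v)).
    g = 0.5 corresponds to a spherical leaf inclination distribution."""
    return math.exp(-clumping * g * lai / math.cos(math.radians(theta_v_deg)))

p_random = gap_fraction(3.0, 30.0)                  # random canopy
p_clumped = gap_fraction(3.0, 30.0, clumping=0.8)   # clumped: larger gap fraction
```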
ation using look-up table (LUT) techniques (Knyazikhin et al., 1998; Weiss et al., 2000). A large range of random combinations of LAI (between 0 and 10, step of 0.01) and ALA (between 10° and 80°, step of 2°) is used to build a database made of the corresponding gap fraction values (Eq. 6) in the zenithal directions defined in the CAN-EYE user parameter window during the CAN-EYE processing. The process then consists in selecting the LUT element of the database that is the closest to the measured P₀. The distance (cost function) C_k of the k-th element of the LUT to the measured gap fraction is computed as the sum of two terms:

CAN-EYE V5.1 (Eq. 7):
C_k = Σ_{j=1}^{Nb_Zenith_Dir} w_j · [ (P_LUT^k(θ_j) − P_MES(θ_j)) / σ_MOD(P₀(θ_j)) ]²  +  [ (ALA_k − 60) / 30 ]²
      (first term)                                                                (second term)

CAN-EYE V6.1 (Eq. 8):
C_k = Σ_{j=1}^{Nb_Zenith_Dir} w_j · [ (P_LUT^k(θ_j) − P_MES(θ_j)) / σ_MOD(P₀(θ_j)) ]²  +  [ (PAI_k − PAI57) / PAI57 ]²
      (first term)                                                                (second term)

The first term computes a weighted relative root mean square error between the measured gap fraction and the LUT one. The weights w_j take into account the fact that some zenithal directions may contain a lot of masked pixels, so that the corresponding gap fraction may not be very representative of the image:

w_j = (NPix_j − NMask_j) / NPix_j        (Eq. 9)

The relative root mean square error is divided by a modelled standard deviation o
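The LUT inversion described by Eq. 8 and Eq. 9 can be sketched as a brute-force search. This is an illustrative reconstruction, not CAN-EYE's code: the ellipsoidal projection function follows Campbell (1986, 1990), the grid steps are coarser than the manual's (0.01 and 2°) to keep the sketch fast, and the constant `sigma` stands in for the fitted σ_MOD.

```python
import numpy as np

def chi_from_ala(ala_deg):
    """Campbell (1990) relation between the mean leaf angle (degrees)
    and the ellipsoid axis ratio chi: ala(rad) = 9.65 * (3 + chi)**-1.65."""
    return (np.radians(ala_deg) / 9.65) ** (-1.0 / 1.65) - 3.0

def g_ellipsoidal(theta, chi):
    """Campbell (1986) ellipsoidal projection function G(theta)."""
    k = np.sqrt(chi ** 2 + np.tan(theta) ** 2) / (
        chi + 1.774 * (chi + 1.182) ** -0.733)
    return k * np.cos(theta)

def retrieve_pai_ala(p_meas, theta_deg, weights, pai57, sigma=0.05):
    """Brute-force LUT inversion in the spirit of Eq. 8 (V6.1):
    first term = weighted relative RMSE, second term = regularization
    pulling PAI towards the 57 degree estimate.  Returns (PAI, ALA)."""
    theta = np.radians(np.asarray(theta_deg, float))
    best, best_cost = (None, None), np.inf
    for pai in np.arange(0.05, 10.0, 0.05):
        for ala in np.arange(10.0, 80.1, 2.0):
            g = g_ellipsoidal(theta, chi_from_ala(ala))
            p_lut = np.exp(-g * pai / np.cos(theta))      # Eq. 6 element
            first = np.sum(weights * ((p_lut - p_meas) / sigma) ** 2)
            second = ((pai - pai57) / pai57) ** 2         # Eq. 8, 2nd term
            if first + second < best_cost:
                best_cost, best = first + second, (pai, ala)
    return best
```

Running it on a synthetic gap-fraction curve generated with known (PAI, ALA) recovers values close to the truth, the regularization term resolving the usual PAI/ALA compensation.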
camera set-up and direction. The image name is not important, except the extension, which should be either .tif or .jpg. Any names are therefore accepted and can be tracked later through the processing and report files.

[Screenshot: Windows Explorer view of the directory D:\Home\DATA\CAN_EYE\Maize, listing nine 1889 × 1417 JPEG images (FDSCN0397_ESU40_SUN_CAL.JPG to FDSCN0405_ESU40_SUN.JPG, about 1.3–1.5 MB each) together with a Calibration sub-directory and other folders.]

Figure 2: Example of the Maize directory containing a series of 9 images (JP
canopy structure, i.e. the relative position of the plants in the canopy. The shape and size of the leaves might also play an important role in the clumping. In CAN-EYE, the clumping index is computed using the Lang and Yueqin (1986) logarithm gap fraction averaging method. The principle is based on the assumption that vegetation elements are locally randomly distributed. Each zenithal ring is divided into groups of individual pixels called cells. The size of the individual cells must be a compromise between two criteria: it should be large enough so that the statistics of the gap fraction are meaningful, and small enough so that the assumption of randomness of the leaf distribution within the cell is valid. For each cell, P₀ is computed, as well as its logarithm. If there is no gap in the cell (only vegetation, i.e. P₀ = 0), P₀ is assumed to be equal to a P_sat value derived from the simple Poisson law using a prescribed LAIsat value. P₀(θ), as well as ln(P₀(θ)), are then averaged over the azimuth and over the images for each zenithal ring. The averaging still takes into account the masked areas, using the weights w_j. The ratio of these two quantities provides the clumping parameter for each zenithal ring:

λ₀(θ) = ln( mean(P₀(θ)) ) / mean( ln(P₀(θ)) )

Note that, since P_sat is simulated using the Poisson model, it depends on the value chosen for both LAIsat and the average leaf inclination angle: the clumping parameter is computed for the whole range of variation of ALA and a LAIsat var
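The log-averaging ratio for one zenithal ring can be written compactly; this is a sketch under our own naming, with a fixed stand-in value for the Poisson-derived saturated gap fraction rather than the LAIsat-dependent one CAN-EYE computes.

```python
import numpy as np

def clumping_lang_yueqin(cell_p0, weights=None, p_sat=1e-3):
    """Lang & Yueqin (1986) log-averaging clumping parameter for one
    zenithal ring.  cell_p0 holds the gap fraction P0 of each cell;
    cells with no gap (P0 = 0) are replaced by p_sat, which in CAN-EYE
    is derived from the Poisson model for a prescribed LAIsat (the
    fixed value used here is only an illustrative stand-in)."""
    p = np.asarray(cell_p0, float)
    p = np.where(p <= 0.0, p_sat, p)
    w = np.ones_like(p) if weights is None else np.asarray(weights, float)
    ln_of_mean = np.log(np.average(p, weights=w))
    mean_of_ln = np.average(np.log(p), weights=w)
    # Jensen's inequality guarantees lambda0 <= 1; equality holds for a
    # uniform (random) ring.
    return ln_of_mean / mean_of_ln
```

A ring with identical cells returns 1 (random canopy); a ring mixing dense and open cells returns a value below 1, i.e. a clumped canopy.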
chniques are all based on contact frequency (Warren Wilson, 1959) or gap fraction (Ross, 1981) measurements. The contact frequency is the probability that a beam (or a probe) penetrating inside the canopy will come into contact with a vegetation element. Conversely, the gap frequency is the probability that this beam will have no contact with the vegetation elements until it reaches a reference level, generally the ground. The term gap fraction is also often used; it refers to the integrated value of the gap frequency over a given domain, and thus to the quantity that can actually be measured, especially using hemispherical images. Measuring the gap fraction is therefore equivalent to measuring the transmittance at ground level in spectral domains where the vegetation elements can be assumed black. It is then possible to consider the mono-directional gap fraction, which is the fraction of ground observed in a given viewing direction (or in a given incident direction). The objective of this section is to provide the theoretical background used in the CAN-EYE software to derive canopy biophysical variables from the bi-directional gap fraction measured from the hemispherical images.

Modeling the Gap Fraction

LAI definition

The leaf area density l(h) at level h in the canopy is defined as the leaf area per unit volume of canopy. The leaf area index (LAI) corresponds to the integral of l(h) over the canopy height. It is therefore defined as the one-sided leaf area per unit hor
cludes binary files named ImageName.cie, coded in unsigned 8-bit integers (uint8). The gap fraction lies between 0 and 100 (0 = vegetation, 100 = gap, values in between = mixed pixels). Invalid values corresponding to masked areas are set to 255. Results will be stored in a subdirectory called CE_P57_Directoryname. For already classified images, the user can either choose a directory that contains CIE_name.zip (or CNE_name.zip) files, or a directory that contains several directories, themselves containing one or several CIE_name.zip files (a kind of batch process). For each CIE_name.zip, results will be stored in a subdirectory called CE_P57_CIE_name.

[Screenshot: CAN-EYE 6.1 main window with the menus Calibration, Hemispherical Images, Images at 57°, Images at nadir (0°), Summary, Help and Exit, showing the sub-menu entries RGB images (Upward / Downward), Already Classified Images (Binary) and Help on Images at 57°.]

Figure 4: CAN-EYE "Images at nadir (0°)" menu.

Images acquired with a camera at the vertical position (nadir)

Select "Images at nadir (0°)". This menu allows the processing of images acquired with a camera (no fish-eye lens) looking vertically at the ground, and therefore the derivation of FCOVER only. It is possible to directly process:
> RGB images (jpeg or tiff) acquired with the system:
o upward: camera on the ground, looking at the sky
o downward: camera above the canopy, looking at
copy of the working window after the masking step.
> CLASSIFICATION RESULTS: screen copy of the working window after the classification step.
> GAP FRACTION: bar plot of the nadir gap fraction as a function of the individual image number. The average cover fraction is shown in red.

The excel file named CE_Nadir_Report_xxx.xls contains several sheets:
> CAN-EYE nadir Parameters: description of the parameters used for the processing (GENERAL PARAMETERS and CALIBRATION PARAMETERS in the html report).
> CAN-EYE Nadir Results: table with the average and individual FCOVER values.

5 SUMMARY MODULE

[Screenshot: CAN-EYE 6.1 main window showing the Summary menu with the entries P180, P57 and NADIR, and Help on Summary.]

Figure 16: CAN-EYE Summary menu.

The Summary menu allows gathering CAN-EYE result-processing excel files into a single one. After selecting the processing type (DHP (P180), images acquired at 57° (P57), or images acquired at NADIR), the user is asked to choose a directory to process. The chosen directory should contain CAN-EYE processing result sub-directories, all called CE_P180… or CE_P57… depending on the processing type. The user is then asked to provide a name for the summary excel file in which all the results will be stored.

Hemispherical images: six sheets are generated:
> Effective PAI: columns contents are, in order: directory na
eft side of the window, using the slider changing the gamma value. The gamma can be reset to its initial value (RESET button). Click OK when the image appears easily classifiable to you.

[Screenshot: gamma correction dialog with a slider ("Change the gamma value to improve visualisation", value 1.00), a RESET button and an OK button.]

Figure 12: Gamma correction value (masking process).

SELECT: selection of one image where a mask must be defined. After clicking this button, the user has to select the image he wants to mask (left mouse click on the desired image). The image then appears alone in the window, with 3 buttons available:

MASK: after clicking on this button, the user has to define a polygon corresponding to the masked area on the image, by clicking the left mouse button to indicate the polygon vertices. The user can then visualize the drawn polygon (see figure below). To end the masking, the user just clicks the right mouse button. The user can add as many masks as he wants by selecting the MASK button again.

Figure 13: Secondary masking window, after having selected the image to be masked.

UNDO: clicking on this button cancels the last mask that was drawn.

DONE: click on this button when you have finished the masking process on the image.

APPLY ALL: apply the last mask to all images. When clicking on this button, the last drawn mask is applied to all the images (see fig
esults from series of CAN-EYE results in an excel file.

[Flowchart: overview of CAN-EYE processing — parameter definition (image size, calibration parameters, circle of interest, angular resolution, FAPAR computation: day of year, latitude), rough then interactive classification, look-up table inversion using the Poisson model (Nilson, 1971) with an ellipsoidal leaf inclination distribution (Campbell, 1984) and leaf clumping, and output generation (HTML report, excel file) of LAI, ALA, FAPAR, FCOVER and gap fraction.]

Figure 1: Overview of the CAN-EYE processing of a series of images.

1.3 Hardware and software requirements

CAN-EYE is a software developed in the MATLAB® language and can be installed on any computer running the Windows OS. It is a compiled version that does not require installing MATLAB® on the computer. To use CAN-EYE, you first have to install the MATLAB Component Runtime (MCR). The MCR is a collection of libraries required to run compiled MATLAB code. It essentially includes all the existing MATLAB® libraries, but does not require a license to be installed. Installing the MCR as
f the measured gap fraction, derived from the empirical values σ(P₀(θ)) computed, for each zenithal direction, from the images corresponding to the same plot when estimating the measured gap fraction after the CAN-EYE classification step. In order to smooth the zenithal variations of σ, a second-order polynomial is fitted on σ(P₀(θ)) to provide σ_MOD(P₀(θ)).

The second term of Eq. 7 and Eq. 8 is a regularization term (Combal et al., 2002) that imposes constraints to improve the PAI estimates. Two equations are proposed:
> Constraint used in CAN-EYE V5.1: on the retrieved ALA values, assuming an average leaf angle close to 60° (± 30°).
> Constraint used in CAN-EYE V6.1: on the retrieved PAI value, which must be close to the one retrieved from the zenithal ring at 57°. This constraint is more efficient and does not require any assumption, unlike Eq. 7, but it can be computed only when the 57° ring is available (COI ≥ 60°).

The LUT gap fraction that provides the minimum value of C_k is then considered as the solution. The corresponding LAI and ALA provide the estimate of the measured CAN-EYE leaf area index and average leaf inclination angle. As there is no assumption about clumping in the expression of the gap fraction used to simulate the LUT (Eq. 6), the foliage is assumed randomly distributed, which is generally not the case in actual canopies. Therefore, retrieval of LAI based on the Poisson model and using gap fraction measu
ft: if the user selects TRUE COLOR, then the original images are displayed. If the user selects CLASSIF, then the original images are displayed with the classification colours: pixels already classified are shown in their class colour, except if they are mixed (case of more than 2 classes chosen) or if they belong to the second class (case of classification without considering mixed pixels). Selecting HELP displays this help page.

> Menu 2 (bottom): four buttons are available:
UNDO: undo the last pixel selection.
RESET: reset the whole classification process.
DISPLAY ALL: this button is enabled when only one image (among all the images processed at the same time) is displayed in the centre of the window, and allows displaying all the images at the same time.
DONE: click on this button when the classification process is achieved. The user is then asked if he really wants to end the classification process. Selecting yes implies that CAN-EYE will binarise the images and compute all the outputs. Note that if no radio button is selected, the user can zoom on a particular image simply by clicking on it, and then choose DISPLAY ALL to see all the images together.

> Menu 3 (top right): displays the palette of all the reduced colours contained in the images. Colours with red bullets represent more than 5% of all the pi
function sheet of the calibration excel file ........................ 43
Figure 25: Experimental design ........................ 44
Figure 26: Example of an image of the experimental design taken with the hemispherical camera and used for the calibration of the projection function. The horizontal dotted yellow line corresponds to the diameter of the image passing through the optical centre, defined by its coordinates as measured previously. The camera is aligned thanks to the front nail and background line ........................ 45
Figure 27: Example of projection function characterization with CAN-EYE ........................ 46

1 INTRODUCTION

CAN-EYE V6.1 is a free software developed at the EMMAH laboratory (Mediterranean Environment and Agro-Hydro System Modelling) of the French National Institute for Agricultural Research (INRA). The authors remind that this software is a didactic product made only for pedagogic uses. It is protected in France by Intellectual Property Regulations, and abroad by international agreements on copyright. It can be downloaded at http://www6.paca.inra.fr/can-eye. For any information, question or bug report, please contact can-eye@avignon.inra.fr.

1.1 CAN-EYE specific features

CAN-EYE is an imaging software working under Windows, used to extract the following canopy structure cha
> GENERAL INFORMATION: CAN-EYE version, user, processing date, processing duration.
> GENERAL PARAMETERS: angular resolution, circle of interest, FCOVER integration domain, sub-sampling factor (see 3.2).
> CALIBRATION PARAMETERS: optical centre, projection function coefficients (see 3.2).
> SELECTED IMAGES: table containing the list of the image file names used for the processing, as well as the corresponding FCOVER estimates.
> NUMBER OF CLASSES: number of classes that was selected (2, or 3 = 2 + mixed; see 3.5).
> AVERAGE BIOPHYSICAL VARIABLES: variables estimated from the series of images. Note that while the FCOVER can be estimated for each image, this is not the case for the other variables, which are derived from the mono-directional gap fraction averaged over all the processed images.
> PROCESSED IMAGES MASK: screen copy of the working window after the masking step.
> CLASSIFICATION RESULTS: screen copy of the working window after the classification step.
> AVERAGE GAP FRACTION: polar plot of the average bi-directional gap fraction (rings correspond to zenithal directions). Masked areas are shown in red.
> CLUMPING FACTOR: graph showing the comparison between the computed clumping factor (green points) and the modelled one (red line) as a function of view zenith angle, for the computation of the true LAI. The average leaf i
h number of images, your computer may run out of memory. Use the sub-sample factor (SSF) to process only one pixel out of SSF in the images.

Output Format: results can be written either in an excel file, or in an ascii file, or both.

[Screenshot: parameter window with User Name: USER; User Comments: WHATEVER; Image Characteristics: LINES 1536, SAMPLES 2048; CCD Sensor Width (mm); Focal Length: 8.2 mm; Sub-Sample Factor; Output Format: EXCEL / ASCII.]

Figure 9: Processing parameter window for images acquired at nadir (0°).

3.3 SELECTING PERTINENT IMAGES

Once the processing parameters are defined, a window is opened showing all the images contained in the processing directory. Some of the images may not be pertinent for the processing (fuzzy images, for example); they must be eliminated.

[Screenshot: image selection window showing thumbnails 1/6 DSCN0015.JPG to 6/6 DSCN0020.JPG, with the buttons TRASH IMAGE, UNDO, RESET and OK.]

Figure 10: Selection of pertinent images.

TRASH IMAGE: click on this button and select the non-pertinent image(s) with the left mouse button. The selected images become white, which means that they won't be used for the processing.
UNDO: click on this button to cancel your last selection.
RESET: click on this button to rese
heet, grey cells must remain unchanged, while yellow cells must be filled with the data corresponding to a given calibration. Note that, for a given camera + fish-eye system, it is required to make a calibration experiment for each image resolution that will be used with the fish-eye. In order to reduce the problem of mixed pixels, it is highly recommended to use the highest resolution providing jpg or tiff files.

[Screenshot: CAN-EYE main window with the Calibration menu open, showing the entries Optical Centre and Projection Function, and Help on Calibration.]

Figure 17: CAN-EYE calibration menu.

The Start sheet describes the calibration experiment for traceability (Figure 18): name of the user, camera reference, and the image resolution in columns (X direction) and lines (Y direction). The resolution can be obtained by looking at an image file's properties in the Windows Explorer, or with any image processing software.

[Screenshot: Start sheet of the calibration excel file, filled with a user name, a camera reference (PANASONIC DMC-FZ, INRA EMMAH) and the instructions: do not change the cells that appear in grey; do not change the sheet names; fill in the cells appearing in yellow. Sheet tabs: Start, Optical Centre, Projection Function, Results.]

Figure 18: Example of the Start sheet of the calibration excel file.
igure 27: Example of projection function characterization with CAN-EYE.

7 REFERENCES

Andrieu, B. and Baret, F., 1993. Indirect methods of estimating crop structure from optical measurements. In: C. Varlet-Grancher, R. Bonhomme and H. Sinoquet (Editors), Crop Structure and Light Microclimate: Characterization and Applications. INRA, Paris, France, pp. 285-322.
Andrieu, B. and Sinoquet, H., 1993. Evaluation of structure description requirements for predicting gap fraction of vegetation canopies. Agric. For. Meteorol., 65: 207-227.
Bonhomme, R., Varlet-Grancher, C. and Chartier, P., 1974. The use of hemispherical photographs for determining the leaf area index of young crops. Photosynthetica, 8(3): 299-301.
Campbell, G.S., 1986. Extinction coefficients for radiation in plant canopies calculated using an ellipsoidal inclination angle distribution. Agric. For. Meteorol., 36: 317-321.
Campbell, G.S., 1990. Derivation of an angle density function for canopies with ellipsoidal leaf angle distributions. Agric. For. Meteorol., 49: 173-176.
Chen, J.M. and Black, T.A., 1992. Defining leaf area index for non-flat leaves. Plant Cell Environ., 15: 421-429.
España, M.L., Baret, F. and Weiss, M., 2008. Slope correction for LAI estimation from gap fraction measurements. Agricultural and Forest Meteorology, 148(10): 1553-1562.
Frazer, G.W., Trofymow, J.A. and Lertzman, K.
izontal ground surface area (Watson, 1947). Although this definition is clear for flat broad leaves, it may cause problems for needles and non-flat leaves. Based on radiative transfer considerations, Lang (1991), Chen and Black (1992) and Stenberg (2006) proposed to define LAI as half the total developed area of leaves per unit ground horizontal surface area. This definition is therefore valid regardless of the vegetation element shape. As defined above, the leaf area index LAI(H) at a level H in the canopy is related to the leaf area density through:

LAI(H) = ∫₀^H l(h) · dh        (Eq. 1)

From LAI to Gap Fraction

Following Warren Wilson (1959), the mean number of contacts N(H, θ, φ) between a light beam and a vegetation element at a given canopy level H in the direction (θ, φ) is:

N(H, θ, φ) = ∫₀^H G(h, θ, φ) · l(h) / cos θ · dh        (Eq. 2)

where G(h, θ, φ) is the projection function, i.e. the mean projection of a unit foliage area at level h in direction (θ, φ). When the leaf area density and the projection function are considered independent of the level h in the canopy, Eq. 2 simplifies into:

N(θ, φ) = G(θ, φ) · LAI / cos θ        (Eq. 3)

The projection function is defined as follows:

G(θ, φ) = (1/2π) ∫₀^{2π} ∫₀^{π/2} |cos ψ| · g(θ_l, φ_l) · sin θ_l · dθ_l · dφ_l        (Eq. 4a)
cos ψ = cos θ · cos θ_l + sin θ · sin θ_l · cos(φ − φ_l)        (Eq. 4b)

where g is the probability density function that describes leaf orient
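Eq. 3 and Eq. 4 can be checked numerically. The sketch below (our own naming, not CAN-EYE code) integrates Eq. 4 on a grid; for the spherical leaf inclination distribution, g = 1 under the normalization of Eq. 5a, and G(θ) = 0.5 in every direction, so the contact frequency reduces to N = 0.5 · LAI / cos θ.

```python
import numpy as np

def g_function(theta, g_leaf=None, n=201, m=401):
    """Numerical integration of Eq. 4 (trapezoidal rule on a grid).
    g_leaf(theta_l) is the leaf inclination distribution under the
    Eq. 5a normalization; g_leaf = 1 is the spherical case."""
    tl = np.linspace(0.0, np.pi / 2, n)           # leaf zenith angles
    pl = np.linspace(0.0, 2 * np.pi, m)           # leaf azimuth angles
    TL, PL = np.meshgrid(tl, pl, indexing="ij")
    cos_psi = (np.cos(theta) * np.cos(TL)
               + np.sin(theta) * np.sin(TL) * np.cos(PL))    # Eq. 4b
    g = np.ones_like(TL) if g_leaf is None else g_leaf(TL)
    integrand = np.abs(cos_psi) * g * np.sin(TL)
    # composite trapezoidal weights in both directions
    wt = np.ones(n); wt[[0, -1]] = 0.5
    wp = np.ones(m); wp[[0, -1]] = 0.5
    dt, dp = tl[1] - tl[0], pl[1] - pl[0]
    return (wt[:, None] * wp[None, :] * integrand).sum() * dt * dp / (2 * np.pi)

def contact_frequency(lai, theta):
    """Eq. 3: N = G(theta) * LAI / cos(theta)."""
    return g_function(theta) * lai / np.cos(theta)
```

This also makes the well-known 57.5° property plausible: for any leaf inclination distribution, G is close to 0.5 near that angle, which is why the 57.5° ring gives an LAI estimate almost independent of ALA.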
le located in the directory where the file CAN_EYE_VXXXX.exe is installed must also be attached to your message. Note that the time required to fix the bug will vary depending on the type of bug, but also as a function of our availability (the software is provided for free), although we will try to do our best. To facilitate the understanding of the bug report, please tell us at which step the bug appears. Send also any additional information, such as the number of images you are processing, their size and format, and how many classes were used. Send eventually your parameter file, as well as a sample of images, or any information that could help us identify the problem.

2 INSTALLATION

The CAN-EYE installation is quite easy:
> Check the system type (32 bits / 64 bits): select the My Computer icon on your desktop, right-click the icon and select Properties from the menu, then check the system type.
> Download the corresponding MATLAB Component Runtime version (R2012a, 32 bits or 64 bits) from the MathWorks web site: http://www.mathworks.fr/products/compiler/mcr
> Install the MATLAB Component Runtime (MCR) by clicking on MCR_R2012a_win32_installer.exe or MCR_R2012a_win64_installer.exe, depending on your system type, and follow the instructions. It is required that you install the MCR with Administrator rights.
> Add the path to the
led in by CAN-EYE.

CCD sensor width: provide the camera CCD sensor width. This allows extracting the useful part of the images to compute the cover fraction. The CCD sensor width depends on the camera; Table 1 provides a list of camera models and the corresponding sensor sizes (sources: http://homepages.tig.com.au/parsog/photo/sensors1.html and http://www.dpreview.com/news/0210/02100402sensorsizes.asp).

Sub-Sample Factor: in case you are processing very high resolution images and/or a low field of view camera and/or a high number of images, your computer may run out of memory. Use the sub-sample factor (SSF) to process only one pixel out of SSF in the images.

Cell Size: the true leaf area index is computed using the Lang & Xiang average logarithm method. The cell size (in pixels) corresponds to the size of the cell on which the local LAI is computed.

Output Format: results can be written either in an excel file, or in an ascii file, or both.

[Screenshot: parameter window with User Name: WEISS; User Comments: BlaBla; Image Characteristics: LINES 2304, SAMPLES 3072; CCD Sensor Width: 5.76 mm; Focal Length: 26.2 mm; Sub-Sample Factor: 1; Computation of clumping index: Cell Size 30 pixels; Output Format: EXCEL / ASCII.]

Figure 8: Processing parameter window for images acquired at 57.5°.
lens + camera system. When selecting binarised images, the following steps are not required, and CAN-EYE can be used in batch processing by selecting the directory that contains all the series of images to be processed, to automatically generate the outputs.

Pre-processing the images

The images are then loaded and displayed in a window. It is possible to interactively select the images not to be processed. At this stage, it is possible to mask parts of the images that are undesirable. In addition, the gamma factor can be changed to brighten or darken the images and provide a better visual discrimination between the vegetation elements and the background (Figure 1, step 2).

Classification (in case of RGB images)

When the pre-processing step ends, the number of colours is reduced to 324, which is sufficient to get good discrimination capacities while remaining small enough to be easily manipulated. The classification is then the most critical phase; it needs to be interactive because the colours associated to each class depend on the illumination conditions and on the objects themselves. The class(es) are first to be defined (Figure 1, step 3). It is generally more efficient to select a single class, corresponding either to the one having simple colours (such as the sky) or to the less represented one (such as the green vegetation for sparse canopies, or the soil background for denser canopies). In this case, the non-classified pixels will be considered
m of 8 images is required (Weiss et al., 2003). No more than 20 images can be processed by CAN-EYE at once. Consequently, if more than 20 images are available for a same canopy, the user must organize them into 2 or more directories. It is therefore not possible with CAN-EYE to determine the LAI of a single tree or plant. The illumination conditions should be about the same within a series of images. If there are large differences in illumination conditions (such as strong direct light versus strong diffuse conditions), it is recommended to split the series of images into homogeneous sub-series. The same applies, obviously, for photos taken over the same canopy but looking either upward or downward: in a given directory, only images taken in a single direction (up or down) should be present.

CAN-EYE accepts only TIFF (.tif) and JPEG (.jpg) image formats, or a binary format for already classified images (see the corresponding sections). The images can be of any size (resolution). However, all the images to be processed concurrently (in parallel) and stored in a single directory should have the same format, size and camera setup (zoom), as well as the same direction (up or down). If this is not the case, create as many directories as combinations of format and size, as well as
me, CE_P180 file name, CEV6.1 PAI estimate, CEV5.1 PAI estimate, Miller PAI estimate, LAI2000 3, 4 and 5 rings estimates, processing date.
> Effective ALA: columns contents are, in order: directory name, CE_P180 file name, CEV6.1 ALA estimate, CEV5.1 ALA estimate, Miller ALA estimate, LAI2000 3, 4 and 5 rings estimates, processing date.
> True PAI: columns contents are, in order: directory name, CE_P180 file name, CEV6.1 PAI estimate, CEV5.1 PAI estimate, Miller PAI estimate, LAI2000 3, 4 and 5 rings estimates, processing date.
> True ALA: columns contents are, in order: directory name, CE_P180 file name, CEV6.1 ALA estimate, CEV5.1 ALA estimate, Miller ALA estimate, LAI2000 3, 4 and 5 rings estimates, processing date.
> FCOVER: columns contents are, in order: directory name, CE_P180 file name, CEV6.1 FCOVER estimate, processing date.
> Daily fAPAR: columns contents are, in order: directory name, CE_P180 file name, measured direct FAPAR estimate, measured modelled FAPAR estimate, estimated direct FAPAR estimate, estimated diffuse FAPAR estimate, processing date.

Images acquired at 57.5°: two sheets are generated:
> Mean: columns contents are, in order: directory name, CE_P57 file name, average gap fraction value at 57°, effective PAI, true PAI for the different LAIsat values (see LAI true), processing date.
> columns contents are, in order: directory name, CE_P57 file name, image name, individual gap fraction value at 57°
nclination value, as well as the RMSE between the computed and modelled clumping factors, are indicated.
> MEASURED GAP FRACTION VS LUT GAP FRACTION & AVERAGE PAI, ALA, FCOVER: the top graph shows the RMSE between the mono-directional gap fraction computed from the images and the closest one found in the LUT (red line), as a function of the average leaf inclination angle. The green line shows the corresponding PAI value providing the lowest RMSE, as a function of the average leaf inclination angle. The bottom graph shows the mono-directional gap fraction estimated from the images (as determined by the classification step) as a function of the view zenith angle (green line). The red line indicates the mono-directional gap fraction of the LUT element that is the closest to the measurements assuming no clumping effect, i.e. when estimating PAI_eff and ALA_eff. In black, the same is shown for the mono-directional gap fraction when considering the clumping factor (PAI_true, ALA_true).

The excel file named CE_P180_Report_xxx.xls contains several sheets:
> CAN-EYE P180 parameters: description of the parameters used for the processing (GENERAL PARAMETERS and CALIBRATION PARAMETERS in the html report).
> PAI, ALA: the different PAI and ALA estimates (effective, true) from the different methods (CAN-EYE V5.1, CAN-EYE V6.1, Miller, LAI2000 3, 4, 5 rings).
> FAPAR: daily integrated values of the direct and diffuse FAPAR (measured and modelled), as well as instantaneous
ng system as well as defining the area of interest. Interactive masking tools allow the user to eliminate parts of the photos contaminated by undesirable objects. Interactive zooming tools are also included. A specific tool is also implemented to take into account the slope of the terrain.
> Portability: CAN-EYE is very easy to install.
> Traceability: CAN-EYE was developed so that all the processing steps are tracked. For this purpose, an html report and output documents (excel format), as well as intermediate results, are automatically generated by the software.

1.2 CAN-EYE basic principles

Figure 1 provides an overview of the different steps required to process series of images with CAN-EYE; they are described hereafter.

Set-up of the processing

The user first selects the type of images he has acquired (DHP, images at 57°, or images at nadir), whether they are in RGB colours or binarised, and whether they were acquired upward (looking at the sky) or downward (looking at the soil). After selecting the directory where the images to be processed are stored, the user has to define the characteristics of the processing. Default values are proposed for all the items. This setup configuration can be saved and used to process another series of photos (Figure 1, step 1). For hemispherical images, a calibration method is proposed to characterize most of the fish-eye
o not change the cells that appear in grey. Fill the yellow cells with the X, Y coordinates of the 3 holes (columns) for each image (lines). Some of the yellow lines can remain empty.

[Screenshot: Optical Centre sheet of the calibration excel file; sheet tabs: Start, Optical Centre, Projection Function, Results.]

Figure 22: Example of the excel file sheet to be filled to determine the optical centre of a system.

Once the Optical Centre sheet is filled, the user must run CAN-EYE, go to the Calibration menu and select Optical Centre. CAN-EYE then asks to select the excel file, computes the optical centre position automatically, and fills the Results sheet. Figure 23 shows an example of optical centre adjustment in the case where three holes were considered. The results show very consistent estimates of the coordinates of the optical centre, which is known with an accuracy better than one pixel.

[Screenshot: table of theoretical and actual optical centre positions for hole 1, hole 2 and hole 3.]

Figure 23: Example of a CAN-EYE output showing the fitting of the circles to the hole positions in the case of three holes. The actual optical centre is shown by the red cross.

6.3 Projection function characterization

This section describes how to perform the measurements and fill the Projection Function sheet of the calibration excel file. The experimental design is described in Figure 25. It consists in a frame of 50 × 50 cm
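Once the calibration points (view angle θ measured on the experimental design, radius R in pixels from the optical centre) are available, the projection function fit of Figure 27 amounts to an ordinary least-squares polynomial fit forced through the origin (R = 0 on the optical axis). The sketch below is illustrative (our own naming, not CAN-EYE's code), and the 0.0532 deg/pixel slope used in the example is simply the order-1 coefficient visible in Figure 27.

```python
import numpy as np

def fit_projection(theta_deg, radius_pix, order=1):
    """Least-squares fit of the projection function theta = P(R) as a
    polynomial of order 1 (theta = a*R) or 2 (theta = a*R^2 + b*R),
    with no constant term so that R = 0 maps to the optical axis.
    Returns the coefficients (highest order first) and the RMSE."""
    R = np.asarray(radius_pix, float)
    t = np.asarray(theta_deg, float)
    A = np.stack([R ** k for k in range(order, 0, -1)], axis=1)
    coef, *_ = np.linalg.lstsq(A, t, rcond=None)
    rmse = np.sqrt(np.mean((A @ coef - t) ** 2))
    return coef, rmse
```

Comparing the RMSE of the order-1 and order-2 fits, as in Figure 27, tells whether the lens deviates noticeably from a polar (equidistant) projection.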
om the calibration process: the optical centre position, as well as the maximum field of view (FOV) in degrees and the corresponding radius (in pixels) in the image. These values can be directly entered in the CAN-EYE parameter window during the processing of hemispherical images (see Filling CAN-EYE window parameter). The contents of the two other sheets, as well as the principles of the measurements, are described in the following.
6.1 System definition
An image can be defined by (Figure 19):
> the number of pixels in the horizontal direction (Xsize);
> the number of pixels in the vertical direction (Ysize);
> the coordinates of the optical centre (X0, Y0);
> the projection function P, i.e. the function θ = P(R) that relates the view angle θ (relative to the optical axis) to the distance R to the optical centre.
Figure 19: Image coordinate system (Xsize, Ysize).
6.2 Optical centre characterization
The optical centre is defined by the projection of the optical axis onto the CCD matrix where the image is recorded. This point should therefore be invariant by rotation of the system around this optical axis. A simple method to get the optical centre consists in observing the coordinates of a point when it rotates around this axis. This can be achieved by drilling a small hole in the cap of the fish-eye and acquiring photographs for a series of positions. This is illustrated by Figure 20. It is possible to use several holes to check the consi
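As an illustration of the projection function defined above (not CAN-EYE code), the sketch below converts a pixel position into a view zenith angle, given the optical centre and polynomial coefficients θ = P(R). The function name and the example coefficients are hypothetical; coefficients are assumed to be given in descending powers, the convention NumPy's `polyval` uses.

```python
import numpy as np

def view_angle(px, py, x0, y0, coeffs):
    """Zenith view angle (degrees) of pixel (px, py) for a fisheye whose
    projection function theta = P(R) is a polynomial in the distance R
    (pixels) to the optical centre (x0, y0). `coeffs` are given in
    descending powers, as expected by numpy.polyval."""
    r = np.hypot(px - x0, py - y0)          # radial distance to optical centre
    return np.polyval(coeffs, r)

# Polar projection (order-1 polynomial): theta = 0.06 deg/pixel * R
theta = view_angle(1500, 1000, 1000, 1000, [0.06, 0.0])
```

For a purely polar projection the polynomial reduces to a single slope coefficient, which is why CAN-EYE treats the order-1 case as polar.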
racteristics from true colour images, either acquired with a fish-eye or with a classic objective:
> LAI: Leaf Area Index
> ALA: Average Leaf inclination Angle
> FAPAR: Fraction of Absorbed Photosynthetically Active Radiation
> FCOVER: vegetation cover fraction
> Bidirectional and mono-directional gap fraction
CAN-EYE is designed to process several images at once with optimal performances. The images can be either RGB images or binarized. It can be used to process:
1. DHP (digital hemispherical photographs) acquired with a fish-eye + camera system: derivation of LAI, ALA, FAPAR, FCOVER and gap fraction;
2. images acquired with a camera inclined at 57.5° from the vertical: derivation of LAI and mono-directional gap fraction at 57.5°;
3. images acquired at nadir (vertical camera), to derive FCOVER.
CAN-EYE has a set of specific features that improve its efficiency, accuracy, flexibility and traceability:
> Efficiency: a series of images is typically processed within 2 to 20 minutes, depending on the complexity of the images, the experience of the user and the performances of the computer used.
> Accuracy: the image conversion into a binarized image (green vegetation / other) is performed through an interactive classification process. This provides more flexibility to separate green elements from the sky or the soil, and allows acquiring images from both above and below the canopy.
> Flexibility: CAN-EYE allows calibrating the imagi
rements will provide estimates of an effective LAI (LAIeff) and a corresponding effective average inclination angle (ALAeff) that allow the description of the observed gap fraction assuming a random spatial distribution. Note that CAN-EYE also proposes other ways of computing PAIeff and ALAeff, using Miller's formula (Miller, 1967), which assumes that the gap fraction only depends on the view zenith angle:

PAIeff = 2 ∫₀^{π/2} −ln[P₀(θ)] cos(θ) sin(θ) dθ   (Eq. 10)

Welles and Norman (1991) proposed a practical method to compute the integral of Eq. 10 from gap fraction measurements in several directions for the LAI-2000 instrument. CAN-EYE provides effective PAI estimates using both Miller's formula and the LAI-2000 method. For the LAI-2000, the ring angular response is taken into account and the computation is made with 3, 4 and 5 rings (Weiss et al., 2004).
From effective leaf area index to true LAI: the true LAI, which can be measured only using a planimeter (with, however, possible allometric relationships to reduce the sampling; Frazer et al., 1997), is related to the effective leaf area index through Eq. 11:

LAIeff = λ₀ · LAI   (Eq. 11)

where λ₀ is the aggregation or dispersion parameter (Nilson, 1971; Lemeur and Blad, 1974), or clumping index (Chen and Black, 1992). It depends both on plant structure, i.e. the way foliage is located along stems for plants, and trunks, branches or shoots for trees, and
reviously, the gap fraction is related both to the leaf area index and to the leaf inclination distribution function (LIDF). It is thus necessary to model the leaf inclination distribution function. The azimuthal variation of the LIDF is often assumed uniform, and this is the case in the CAN-EYE software, i.e. the probability density function g(θl) depends only on the leaf normal zenith angle. This assumption is verified in many canopies but may be problematic for heliotropic plants like sunflowers (Andrieu and Sinoquet, 1993).
Among existing models, the ellipsoidal distribution is very convenient and widely used (Campbell, 1986; Campbell, 1990; Wang and Jarvis, 1988): the leaf inclination distribution is described by the ratio of the horizontal to the vertical axes of the ellipse, which is related to the average leaf inclination angle (ALA, the variable used in CAN-EYE), knowing that

ALA = ∫₀^{π/2} θl g(θl) dθl

and that g(θl) is the probability density function that verifies the normalization condition (Eq. 5):

∫₀^{π/2} g(θl) dθl = 1

Estimating leaf area index and leaf inclination from gap fraction measurements. Use of a single direction: LAI57. Considering the inclined point quadrat method, Warren Wilson (1960) proposed a formulation of the variation of the contact frequency as a function of the view zenith and foliage inclination angles. Using this formulation, Warren Wilson (1
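To make the ellipsoidal model concrete, the sketch below uses Campbell's (1990) closed-form approximation of the extinction coefficient K(θ, χ), where χ is the horizontal-to-vertical axis ratio (χ = 1 corresponds to the spherical distribution), together with the Poisson gap fraction model. This is an illustration under the stated assumptions, not CAN-EYE code, and the coefficients (1.702, 1.12, 0.708) are those of the Campbell (1990) approximation.

```python
import numpy as np

def ellipsoidal_k(theta, chi):
    """Extinction coefficient for the ellipsoidal leaf angle distribution
    (Campbell 1990 approximation). theta: view zenith angle in radians;
    chi: ratio of horizontal to vertical axes (chi = 1 -> spherical)."""
    return np.sqrt(chi**2 + np.tan(theta)**2) / (chi + 1.702 * (chi + 1.12)**-0.708)

def gap_fraction(theta, lai, chi):
    """Poisson-model gap fraction: P0(theta) = exp(-K(theta, chi) * LAI)."""
    return np.exp(-ellipsoidal_k(theta, chi) * lai)

k0 = ellipsoidal_k(0.0, 1.0)   # ~0.5 at nadir for a spherical distribution
p = gap_fraction(0.0, 2.0, 1.0)
```

The nadir value K ≈ 0.5 for χ = 1 matches the familiar G = 0.5 of the spherical distribution.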
rge occurrence of mixed pixels in these areas. Do not use a COI value that is outside the domain used to calibrate the projection function.
> Sub-Sample Factor: if the images are too numerous or too large, the computer may not have enough memory; a possibility is to perform the processing using only one pixel out of 2 (Sub-Sample Factor = 2) or one pixel out of 3 (Sub-Sample Factor = 3).
> Angular Resolution in the zenith (θ) and azimuth (φ) directions: determines the angles for which the gap fraction will be computed. Low values induce higher computation time. By default, it is set to the lowest value (highest resolution) for both angles.
> FCover (in degrees): defines the size of the solid angle used to compute the cover fraction, i.e. the gap fraction in the nadir direction. By default, it is set to 10 degrees.
> FAPAR: computed as the integral of (1 − gap fraction) during the sun course. The latter is determined for a given day, provided as the day number of the year (default is the acquisition date of the image), at a given latitude in degrees (default is 43°, but you need to provide the latitude of your experiment site).
> SAVING DATA: results can be written in an excel file, an ascii file, or both.
> SAVE: click on this button once you have finished filling the parameters. The parameter file will be saved in a subdirectory called Par
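The sun course mentioned above only requires the solar zenith angle as a function of day of year, latitude and solar hour. The sketch below is a minimal illustration (not CAN-EYE code) using the simple cosine approximation of the solar declination; function names and the declination formula are assumptions, not the software's internals.

```python
import numpy as np

def solar_zenith(day_of_year, latitude_deg, solar_hour):
    """Solar zenith angle (degrees), using the simple declination
    approximation delta = -23.45 deg * cos(2*pi*(d + 10)/365)."""
    delta = np.radians(-23.45) * np.cos(2 * np.pi * (day_of_year + 10) / 365.0)
    phi = np.radians(latitude_deg)
    h = np.radians(15.0 * (solar_hour - 12.0))   # hour angle: 15 deg per hour
    cos_tz = np.sin(phi) * np.sin(delta) + np.cos(phi) * np.cos(delta) * np.cos(h)
    return np.degrees(np.arccos(np.clip(cos_tz, -1.0, 1.0)))

# Solar noon at latitude 43 deg (the CAN-EYE default) on the June solstice
tz = solar_zenith(172, 43.0, 12.0)
```

At solar noon on the solstice the zenith angle reduces to latitude minus declination, about 19.6° here.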
stency of the estimation of the optical centre (Figure 21).
Figure 20: Illustration of the holes drilled in the fish-eye cap. The red arrow indicates the rotation of the cap.
Figure 21: Series of images taken for several positions of the fish-eye cap. In this case, three holes were considered.
The rotation positions should be, if possible, symmetric, in order to minimize possible biases in the estimation of the optical centre. Once the photos are acquired (10 to 20 positions are enough), the coordinates of the different holes, as well as of the image upper-left corner, must be extracted with an image processing software (e.g. Paintshop, Microsoft Photo Editor). Then the user has to fill the Optical Centre sheet of the excel file (Figure 22). It consists in typing the 3 hole coordinates in the X and Y directions for the different positions of the fish-eye cap.
(Spreadsheet screenshot: columns Hole 1, Hole 2, Hole 3 and Upper Left Corner, each with X and Y coordinates, one line per position of the fish-eye cap.)
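Since each rotated hole traces a circle around the optical centre, the centre can be recovered from the typed coordinates by a least-squares circle fit. The sketch below illustrates the principle with the algebraic (Kåsa) fit on synthetic hole positions; it is not CAN-EYE code and the function name is hypothetical.

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit.
    Solves x^2 + y^2 = 2*a*x + 2*b*y + c for the centre (a, b)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(c + a**2 + b**2)
    return a, b, radius

# Synthetic hole positions on a circle centred at (1030, 980), radius 400 px
angles = np.linspace(0, 2 * np.pi, 15, endpoint=False)
x = 1030 + 400 * np.cos(angles)
y = 980 + 400 * np.sin(angles)
cx, cy, r = fit_circle(x, y)
```

Fitting one circle per hole and comparing the three centres gives the consistency check mentioned above.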
sures that the code behaves exactly like it would under MATLAB®. You will need to install the MCR only once. However, future CAN-EYE updates may require installing a new MCR version, but this will be indicated with the CAN-EYE release. The MCR can be downloaded from the CAN-EYE website: http://www6.paca.inra.fr/can-eye
1.4 Copyright and limitations of responsibility
The CAN-EYE software and the accompanying documentation described herein are provided as freeware. INRA reserves the right to make changes to this document and software at any time and without notice. INRA shall not be liable for any direct, consequential or other damages suffered by the use of the CAN-EYE software package or its documentation.
1.5 Upgrading the version
A dedicated web site is available for the CAN-EYE software at the address http://www6.paca.inra.fr/can-eye. This site is regularly updated and new versions are made available, as well as a small FAQ. When downloading CAN-EYE, you are asked to provide your e-mail; you will then be automatically informed of new releases and announcements.
1.6 Bug report
CAN-EYE aims at being free from bugs. However, not all situations have been tested and bugs may remain. It is therefore recommended to always use the latest CAN-EYE version available. In case of observed bugs, and to ease their correction, please send a bug report to can-eye@avignon.inra.fr. The file CAN-EYE logfi
t. J. Bot. 11, 95-105.
Watson, D.J., 1947. Comparative physiological studies in growth of field crops. I. Variation in net assimilation rate and leaf area between species and varieties and within and between years. Ann. Bot. 11, 41-76.
Weiss, M., Baret, F., Myneni, R.B., Pragnère, A. and Knyazikhin, Y., 2000. Investigation of a model inversion technique to estimate canopy biophysical variables from spectral and directional reflectance data. Agronomie 20, 3-22.
Weiss, M., Baret, F., Smith, G.J. and Jonckheere, I., 2003. Methods for in situ leaf area index measurement, part II: from gap fraction to leaf area index retrieval methods and sampling strategies. Agric. For. Meteorol., submitted August 2002.
Weiss, M., Baret, F., Smith, G.J. and Jonckheere, I., 2004. Methods for in situ leaf area index measurement, part II: from gap fraction to leaf area index retrieval methods and sampling strategies. Agric. For. Meteorol. 121, 17-53.
Welles, J.M. and Norman, J.M., 1991. Instrument for indirect measurement of canopy architecture. Agronomy J. 83(5), 818-825.
t the selection; in that case, all the images in the directory are selected. DONE: click on this button when you have finished selecting your pertinent images.
3.4 MASKING IMAGES
Some parts of the images must not be processed, because:
> they do not correspond to vegetation or background (an operator or undesired objects are present in the images);
> they would be very difficult to classify (presence of some over-exposed areas in the images, where leaves, soil or sky appear completely white).
This part of the processing allows masking these areas so that they will not be taken into account in the image processing. There is also the possibility to apply a gamma correction factor to the image. This correction is just a visual correction to help the user better discriminate between vegetation and background; it will not impact the model inversion results. Note that we also intend to implement the slope correction described in España et al. (2008) in a next release. Once the pertinent images are selected, the user gets access to the following window:
Figure 11: Main masking window (GAMMA, UNDO, RESET, DONE, SLOPE buttons).
GAMMA (Gamma Correction): select this button to get access either to a frame where you can put your gamma value l
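For reference, a gamma correction of the kind described above is a simple per-pixel power law. The sketch below is an illustration of the operation (not the software's implementation); as the manual notes, it only changes the display, not the classification or the inversion results.

```python
import numpy as np

def gamma_correct(img, gamma):
    """Visual gamma correction of an 8-bit image:
    out = 255 * (in / 255) ** (1 / gamma). gamma > 1 brightens
    mid-tones, which can help to discriminate dark vegetation."""
    x = np.asarray(img, dtype=np.float64) / 255.0
    return np.clip(255.0 * x ** (1.0 / gamma), 0, 255).astype(np.uint8)

# Black and white are preserved; the mid-tone 64 is brightened
out = gamma_correct(np.array([[0, 64, 255]], dtype=np.uint8), 2.2)
```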
te (1, 1) and the lower right corner is located at (NbLines, NbRows).
Projection function: the view angle in degrees is considered as a polynomial function (maximum order 3) of the distance (in pixels) between a pixel of the image and the optical centre; if the degree of the polynomial is 1, the projection is assumed to be polar. Be very careful when entering the coefficients to sort them in the right order (descending powers). Note that after entering the polynomial coefficients, the polynomial function is plotted so that the user is able to check that it is correct.
Figure 7: Creating projection function and optical centre characteristics (example: optical centre at LINE 1458, COLUMN 2190; projection function θ = P1·R³ + P2·R² + P3·R with P1 = 0.0000001004, P2 = 0.00001131, P3 = 0.0613; the view angle is plotted against the distance to the optical centre in pixels).
Processing parameters for images at 57.5°
This window allows defining all the camera characteristics (field of view) as well as the characteristics required for the processing:
> User Name: by default, will display the user name (environment system variable) of the computer.
> Comment: add any useful comment; it will be written in the processing report and output files.
> Image size: automatically fil
ur and latitude. Depending on latitude, CAN-EYE computes the solar zenith angle every solar hour during half the day (there is a symmetry at 12:00).
1. The instantaneous fAPAR is approximated at each solar hour as one minus the gap fraction in the corresponding solar zenith angle:

fAPAR(θs) = 1 − P₀(θs)

2. The daily integrated black-sky (or direct) fAPAR is computed as:

fAPAR_BS = [ ∫ from sunrise to sunset of cos(θs) (1 − P₀(θs)) dt ] / [ ∫ from sunrise to sunset of cos(θs) dt ]

3. The white-sky (or diffuse) fAPAR is computed as:

fAPAR_WS = (1/π) ∫₀^{2π} ∫₀^{π/2} (1 − P₀(θ)) cos(θ) sin(θ) dθ dφ = 2 ∫₀^{π/2} (1 − P₀(θ)) cos(θ) sin(θ) dθ

4.2 Description of CAN-EYE output directory content
During the processing, a CAN-EYE output subdirectory is automatically generated in the directory containing the images to be processed. It contains an html report describing the processing (a screen copy of the different steps: masking, classification, resulting graphs), a table with the processing characteristics (processing parameters used) and a table containing the CAN-EYE output variable estimates.
Hemispherical images: DHP results are contained in a subdirectory called CE_P180_xxx, where xxx is the name of the directory that contains the images to be processed. The html report is named CE_P180_Report_xxx.html. It contains 10 sections:
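The black-sky and white-sky fAPAR integrals above can be sketched numerically as follows. This is an illustration only (not CAN-EYE code): the black-sky value is approximated as a cosine-weighted mean over regularly sampled solar zenith angles, and the test case uses a fully opaque canopy (P₀ = 0), for which both quantities must equal 1.

```python
import numpy as np

def black_sky_fapar(zeniths_rad, p0):
    """Daily black-sky fAPAR: cosine-weighted mean of (1 - P0) over the
    sun course. `zeniths_rad` samples the solar zenith angle at regular
    time steps between sunrise and sunset; `p0` maps zenith to gap fraction."""
    w = np.cos(zeniths_rad)
    return np.sum(w * (1.0 - p0(zeniths_rad))) / np.sum(w)

def white_sky_fapar(p0, n=5000):
    """Diffuse fAPAR: 2 * int_0^{pi/2} (1 - P0) cos(theta) sin(theta) dtheta,
    evaluated with the trapezoidal rule."""
    t = np.linspace(0.0, np.pi / 2, n)
    f = (1.0 - p0(t)) * np.cos(t) * np.sin(t)
    return 2.0 * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t))

# Opaque canopy (P0 = 0 everywhere) absorbs all incident PAR
z = np.linspace(np.radians(20), np.radians(89), 50)
fb = black_sky_fapar(z, lambda t: np.zeros_like(t))
fw = white_sky_fapar(lambda t: np.zeros_like(t))
```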
ure below. This can be very useful if the object to be masked is always at the same place in the images.
Figure 14: Applying a mask to all images.
UNDO (Undo last mask): clicking on this button erases the last mask the user has defined. It is possible to click x times on this button to erase the last x masks that have been defined.
RESET (Reset all the masks): clicking on this button resets all the masks for all the images.
DONE (End the masking): clicking on this button ends the masking process and leads to the following step, image classification.
SLOPE (Define slope): this button will be available in a next release. It is useful only if the images were not acquired on flat terrain. The correction will be based on España et al. (2008).
3.5 CLASS DEFINITION
Once the masking step is achieved, CAN-EYE runs the image indexation; this allows speeding up the rest of the processing. The CLASS DEFINITION window allows indicating the way you intend to achieve the classification with CAN-EYE. You have several choices, depending on whether you consider mixed pixels or not:
> No mixed pixels (2 classes): classify the pixels belonging to one class; all the remaining pixels are assumed to belong to the other class.
> 2 classes + mixed pixels: classify the pixels belonging to the two classes; all the remaining pixels are assumed to be mixed.
values:
> Mono Po: mono-directional gap fraction averaged over all the images, as well as individual mono-directional gap fractions, as a function of the view zenith angle.
> P57 results: gap fraction at 57.5° and derived LAI, for individual images and averaged over all the images. Note that PAI57 is an indicative value and should not be used, since it does not fulfil the Poisson assumptions.
> Average Bidir Po: bi-directional gap fraction averaged over all the images, as a function of the view zenith (columns) and azimuth (rows) angles.
> The next sheets correspond to the individual bi-directional gap fractions as a function of the view zenith (columns) and azimuth (rows) angles. The sheet names correspond to the image names.
Images acquired at 57.5°: results are contained in a subdirectory called CE_P57_xxx, where xxx is the name of the directory that contains the images to be processed. The html report is named CE_P57_Report_xxx.html. It contains 10 sections:
> GENERAL INFORMATION: CAN-EYE version, user, processing date, processing duration.
> CALIBRATION PARAMETERS: image size, camera model, CCD size, sub-sampling factor, resolution for clumping.
> SELECTED IMAGES: table containing the list of the image file names used for the processing, as well as the corresponding gap fraction estimates at 57.5° and the corresponding PAI.
> NUMBER OF CLASSES: number of classes that was selected (2 or 2 + mixed, see 3.5).
> PROC
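The PAI at 57.5° mentioned above exploits the fact that the extinction coefficient at that angle is almost independent of leaf inclination (G ≈ 0.5, so K = G/cos(57.5°) ≈ 0.93). The sketch below illustrates this commonly used inversion; it is not CAN-EYE code, and as the manual warns, the value is only indicative.

```python
import numpy as np

def pai57(gap_fraction_57):
    """Indicative PAI from the gap fraction at 57.5 deg, where
    K = G / cos(theta) with G ~ 0.5 is nearly independent of the
    leaf inclination distribution."""
    k = 0.5 / np.cos(np.radians(57.5))   # ~0.93
    return -np.log(gap_fraction_57) / k

# Round trip: a gap fraction generated with PAI = 3 inverts back to 3
p = np.exp(-(0.5 / np.cos(np.radians(57.5))) * 3.0)
pai = pai57(p)
```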
wo fixed directions, (Xp2, Yp2) (cells C16, D16) and (Xp3, Yp3) (cells F16, G16), on Im1, read the corresponding values x (in cm, cells E16 and H16) on the perpendicular ruler. Do the same for Im2 (cells E17, H17). It is then possible to compute the actual distance H if A is known: since the viewing direction is fixed, tan(θ) = x₁/H = x₂/(H + A), which gives H = A·x₁/(x₂ − x₁).
Once the distance H is known, the calibration of the projection function can be achieved if the coordinates on one of the 2 images (select the one that is the most readable) are associated to the actual distances read on the rulers. The coordinates have to be read on the line passing through the optical axis for the three rulers. This can be achieved for each cm tick. The following equations are used to derive the angle from the values read on each ruler:
> for the perpendicular ruler: θ = arctan(x/H)
> for the lateral rulers: θ = arctan(W/(H − y)), where W is the lateral distance between the ruler and the optical axis (half the width of the frame) and y the reading along the ruler.
Therefore, for the different readings (in cm) on the left lateral ruler (cells A23 to AXX), report the column number of the pixel in the image (cells B23 to BXX). Perform the same for the perpendicular and right lateral rulers.
Figure 25: Experimental design scheme (background alignment, front nail, axis of the design, perpendicular ruler, left and right lateral rulers, camera).
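The ruler readings above can be turned into calibration pairs (R in pixels, θ in degrees) and fitted. The sketch below assumes the reconstructed geometry just described (H, W in cm, hypothetical values) and a polar projection θ = a·R fitted by least squares through the origin; it is an illustration, not CAN-EYE code, and the radii are synthetic.

```python
import numpy as np

# Hypothetical design dimensions: camera-to-ruler distance H and
# lateral offset W of the lateral rulers (half the 50 cm frame width)
H, W = 40.0, 25.0
x = np.arange(1.0, 25.0)                        # perpendicular ruler ticks (cm)
theta_perp = np.degrees(np.arctan(x / H))
y = np.arange(1.0, 20.0)                        # lateral ruler ticks (cm)
theta_lat = np.degrees(np.arctan(W / (H - y)))

# Given the pixel radii R of the same ticks, a polar projection
# theta = a * R is fitted through the origin: a = sum(theta*R) / sum(R^2)
R = theta_perp / 0.0532                          # synthetic radii for a = 0.0532
a = np.sum(theta_perp * R) / np.sum(R**2)
```

The fitted slope `a` (degrees per pixel) is the coefficient the Projection Function sheet is designed to produce.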
xels together, colours with a white bullet represent between 1% and 5% of all the pixels, and colours without a bullet represent less than 1% of the image. During the classification process, the colours are organised so that all the colours belonging to a given class are grouped in a frame with a border of the same colour as the class. For mixed pixels, there is no border line and they are located at the bottom of the colour palette.
> Menu 4 (bottom right): each class name is provided on a button. Clicking on this button allows changing the colour attributed to the class. This may be used to ease the classification process. On the left of the class name, round radio buttons are available. Once he has selected the radio button, the user is invited to click on pixels, either in the image or in the palette, that belong to this class. Once the selection is achieved, the user clicks on the right button to end pixel or colour selection. All the pixels in the image that have the same colour as the ones previously selected are then classified in the chosen class. If the user selects the square radio button located at the left of the class name, he has to select a polygon (same as in the masking process) to force a whole part of an image to belong to the class, without having an impact on the pixels that are not included in this area. This may be useful, for example, when some over-exposed parts of leaves are classified as sky since they appear very bright
ying between 8 and 12. Note that all the results in the CAN-EYE html report are provided for a value of 10. Then the same algorithm as described previously for the effective LAI is applied, by building a LUT using the modified Poisson model (Eq. 7), to provide the true LAI (or PAI) and ALA, as well as the corresponding clumping parameter λ₀.
Claiming that devices and associated methods based on gap fraction measurements provide an estimate of the leaf area index is not right, since indirect measurements only allow assessing the plant area index. Indeed, it is not possible to know if some leaves are present behind the stems, branches or trunk. Therefore, masking some parts of the plants (which is possible using CAN-EYE) to keep only the visible leaves is not correct and could lead to a large under-estimation of the actual LAI value, depending on the way leaves are grouped with the other parts of the plant. Therefore, all CAN-EYE outputs correspond to the plant area index and not the leaf area index.
Cover fraction computation: the cover fraction (fCover) is defined as the fraction of soil covered by the vegetation viewed in the nadir direction:

fCover = 1 − P₀(0)   (Eq. 12)

Using hemispherical images, it is not possible to get a value in the exact nadir direction, and the cover fraction must be integrated over a range of zenith angles. In CAN-EYE, the default value
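Averaging Eq. 12 over a cone around nadir, as described above, can be sketched as follows. This is an illustration only (not CAN-EYE code): the gap fraction is averaged over the cone with solid-angle weighting (sin θ dθ) before being subtracted from 1.

```python
import numpy as np

def fcover(p0, half_angle_deg=10.0, n=2000):
    """Cover fraction as 1 - gap fraction averaged over a cone around
    nadir, weighted by the solid-angle element sin(theta) dtheta.
    `p0` maps zenith angle (radians) to gap fraction."""
    t = np.linspace(0.0, np.radians(half_angle_deg), n)
    f = np.sin(t) * (1.0 - p0(t))
    w = np.sin(t)
    num = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t))   # trapezoidal rule
    den = np.sum(0.5 * (w[1:] + w[:-1]) * np.diff(t))
    return num / den

# A uniform gap fraction of 0.4 inside the cone gives fCover = 0.6
fc = fcover(lambda t: np.full_like(t, 0.4))
```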