Frequently Asked Questions
- Calibration Pass or Fail Status
- How do I arrange for calibration service?
- How do I obtain a quote for calibration of my equipment?
- Having Trouble Accessing My Certificates Online
- Link to their discussion forum
- Link to their Documents Library
- What are “SYSTEMS”?
- What is a “START” program?
- SMS Driver Issue
- Why has my magnification changed?
- How do I Export?
- How many points should I take?
- Definition of Features
Newage Hardness Testing
- Hardness Testing Reference Information - this site has many useful articles and reference information. They also have a complete listing of all ASTM documents relating to Hardness Testing. (Also see our Metallurgy Reference Information Page)
Calibration Certificates issued by Assurance Technologies, Inc. may not have the ‘Pass’ or ‘Fail’ box checked. It may not be possible to calibrate a device to the original factory specifications for a new machine, or in some cases the original factory specification may not be known. As an example, the maximum deviations of a device may not be acceptable for measurements with a tolerance of ± .001”, but may be acceptable for measurements with a tolerance of ± .010”. In these cases, it is the responsibility of the customer to indicate ‘Pass’ or ‘Fail’.
Here are the guidelines used by ATI Service Engineers.
- For calibrations performed in accordance with national and international standards that specify performance criteria, the Service Engineer will check the appropriate ‘Pass’ or ‘Fail’ box on the calibration certificate. An example of this is the calibration of Rockwell type Hardness Testers to ASTM E-18.
- For calibrations performed by ATI that do not have national or international standards that specify performance criteria, the device is calibrated by ATI using procedures developed from manufacturer’s specifications and/or accepted industry practices and/or customer requirements. These Calibration Certificates define and document the deviation found, but the Service Engineer will not indicate ‘Pass’ or ‘Fail’ on the calibration certificate. If the customer provides the performance criteria at the time of calibration, the Service Engineer will check the appropriate ‘Pass’ or ‘Fail’ box and make a notation in the Comments section of the calibration certificate.
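The tolerance example above can be sketched as a simple check. The deviation and tolerance values here are hypothetical and chosen only to mirror the ± .001” vs. ± .010” example:

```python
# Minimal sketch of the pass/fail decision described above.
# The deviation and tolerance values below are hypothetical examples.

def pass_fail(max_deviation_in: float, tolerance_in: float) -> str:
    """Return 'Pass' if the device's maximum deviation fits the tolerance."""
    return "Pass" if abs(max_deviation_in) <= tolerance_in else "Fail"

# The same 0.003" deviation fails a +/- .001" tolerance but passes +/- .010".
print(pass_fail(0.003, 0.001))  # Fail
print(pass_fail(0.003, 0.010))  # Pass
```

This is why the customer must supply the performance criteria: the same measured deviation can be a ‘Pass’ or a ‘Fail’ depending on the tolerance the part requires.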
Calibration is generally either onsite at the customer's facility or here at our lab.
For our current customers:
Onsite Calibration - One of our Service Engineers will contact you 30 to 45 days in advance of the calibration date to schedule service of your current equipment. Contact our Metrology Manager on ext. 156 should you have any questions.
Calibration @ ATI Lab - A letter is sent 45 days prior to the calibration date notifying you of equipment due for calibration. Contact our Lab Technicians on ext. 120 should you have any questions.
For new or current customers:
Onsite Calibrations - Please call our Service Desk at ext. 100. A member of our team will take your message and contact your Service Engineer who will return your call within 24 hours to schedule the calibration. From time to time, emergencies do arise. When this happens, please let us know that it's an emergency and we will call the Service Engineer immediately with the specifics of your situation. Because our Service Engineers are often at customer locations, we may not be able to reach them immediately; however, every effort will be made to take care of your emergency as quickly as possible.
Calibration @ ATI Lab - Please call our Lab Technicians on ext. 120. They will be happy to answer any of your questions and provide you with a quotation.
Assurance Technologies, Inc. is happy to provide written quotations for your calibration or service needs. Please send the following information to us by email to Service@ATIQuality.com or by FAX to (630) 550-5001.
- Your name
- Company name
- Phone number
- Fax number
- Email address
- Type (i.e. micrometer, optical comparator, hardness tester)
- Model number
- Serial number (if applicable)
- Desired calibration points (if appropriate)
- Description of any problems (if applicable)
Systems are the ‘Frame of Reference’ by which we measure. If I were to hold up a common business card (3.5” x 2” x .02”) and ask you ‘How wide is it?’, your answer might be 3.5” or 2” or .02” depending on how I was holding it. If I were holding it at an angle, the correct answer could vary even more. The same holds true for your measurement machine. Answering the question ‘How wide is it?’ requires you to define how you are looking at your part. Systems define the part orientation to the machine. Systems are composed of three parts: Level, Skew and Origin.
Is your part flat to the camera? Leveling allows you to mathematically set your part perpendicular to the camera. When your equipment is calibrated, your Service Engineer sets the Stage Glass level to the camera. If your part sits flat on the glass, Leveling may not be required.
Is your part squared up to the stage travel? When you move your machine in the X or Y direction, it should move along the X or Y direction as defined on your part drawing. Skewing mathematically rotates your part to line up with an axis of the machine. You may not always require a Skew. For example, the diameter of a circle does not change if you rotate your part, but its XY location will.
Where is your part? Origins establish where, in X, Y & Z, your part is located. These are usually taken from the datums on your part drawing. Origins can be made in X, Y and Z or any combination. Not all measurements require an origin. For example, a diameter does not change if the part is moved.
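As a rough 2D sketch of what Skew and Origin do mathematically (the coordinates and datum edge below are illustrative, not InSpec’s internal math), applying a system amounts to shifting raw stage coordinates to the Origin and rotating away the Skew angle:

```python
import math

def apply_system(point, skew_angle_rad, origin):
    """Shift a raw stage coordinate so 'origin' maps to (0, 0), then
    rotate by -skew_angle so the part's datum edge lies along the X axis."""
    x, y = point[0] - origin[0], point[1] - origin[1]
    c, s = math.cos(-skew_angle_rad), math.sin(-skew_angle_rad)
    return (x * c - y * s, x * s + y * c)

# Hypothetical datum edge measured from (1.0, 1.0) to (2.0, 1.1) on the stage:
skew = math.atan2(1.1 - 1.0, 2.0 - 1.0)      # edge is tilted ~5.7 degrees
end = apply_system((2.0, 1.1), skew, (1.0, 1.0))
print(end)  # the far end of the edge now lies on the X axis (y ~= 0)
```

After the system is applied, every measurement is reported in the part’s own coordinate frame rather than the machine’s.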
When you are writing a program using a MicroVu Automated Vision system, the machine will measure automatically after your first system has been established. You can choose to measure this system every time you inspect a part, or with the use of a Stage Square, the measurement can start automatically after you click the RUN icon. Click here for the procedure on writing and using a START program.
MicroVu InSpec software is not compatible with SMS Mirror Driver software. SMS Mirror Driver is used by some IT departments to remotely access a computer and troubleshoot various problems. Unfortunately it cannot coexist with the Matrox drivers used in the InSpec software. Please click here for a document describing how to identify the problem.
MicroVu’s InSpec 2.X Metrology Vision Software reports the true magnification at the top of the video window. Through ‘Plug and Play’, the software knows what size monitor you are using. If you change your monitor or the resolution of your display, the magnification reported on your screen may change. For example, at your lowest zoom level the magnification may have been reported as 18X, but after you install a larger monitor, it may read 20X. This refers to the size of your part on the monitor and does not affect the accuracy of your measurements. Please call the Metrology Manager at extension 156 if you have any questions.
Exporting with MicroVu’s InSpec for Windows software takes place in two parts: setting up the features and setting up the file. Here is a brief description of the process. Please consult your software manual for detailed instructions.
Setting up the Features
After you have written and saved your inspection program, right-click in the Feature List on a feature you wish to Export, then select EXPORT. Check the boxes for the properties you wish to Export. Repeat this for each feature, then save your program.
Setting up the File for a ‘One Time’ Export
Click on the FILE menu at the top of the screen, and then select EXPORT. Select the filename, location and delimiters for your file, then select OK. A text file will be created.
Setting up a File for automatic Exporting
Click on the TOOLS menu at the top of the screen, and then select PLAYBACK OPTIONS. Select EXPORT FILE. Choose either ‘After Run’ (data is exported immediately after the program finishes) or ‘After Run Confirm’ (a window comes up when the program finishes, asking if you would like to export the data). Select the filename, location and delimiters for your file. Save your program.
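Once exported, the delimited text file can be read by other tools. A minimal sketch, assuming a comma-delimited file whose columns are a feature name, measured value, and deviation (these columns are hypothetical; your actual layout depends on which properties you checked and the delimiters you chose):

```python
import csv
import io

# Hypothetical export contents; real columns depend on the properties you
# checked per feature and the delimiter chosen in the Export dialog.
sample = "Circle_1,0.5012,0.0012\nLine_2,1.2504,0.0004\n"

for name, value, deviation in csv.reader(io.StringIO(sample)):
    print(f"{name}: value {float(value):.4f}, deviation {float(deviation):.4f}")
```

Choosing a comma or tab delimiter in the Export dialog makes the file directly importable into spreadsheets and SPC packages.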
The number of points you use to define a feature depends on a number of factors. Ideally, the more points you use, the more ‘Statistically Valid’ your measurement will be; however, the more points you take, the longer it will take to measure your parts. When deciding how many points to take, weigh these factors and use sound judgment. Here are some basic guidelines.
- For every type of feature there is a specific minimum number of points required <click here for a chart>. Using the ‘MORE INPUTS’ icon allows you to increase this value.
- Experiment. If you are measuring circles using the minimum of three points and your values are not repeating, try increasing the number of points until you obtain repeatable readings. Circles are never perfectly round and lines are never perfectly straight.
- The more points you take, the less variance you will see.
- Consider your tolerances. Increasing the number of points for measurements with a large tolerance zone may not significantly add value to your measurements.
- Consider your parts. If your edges have a lot of variance, it is wise to take more points.
If you are using an automated machine with vision, use a Field Tool whenever possible rather than single points. A Field Tool samples an edge, taking hundreds of data points.
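The repeatability guideline above can be illustrated with a small simulation. This is not InSpec’s measurement algorithm; the part size, edge noise, and the centroid-based diameter estimate are all made up for illustration:

```python
import math
import random
import statistics

random.seed(0)

def measured_diameter(n_points, true_r=0.500, noise=0.001):
    """Simulate probing n_points around a circle with noisy edge detection,
    then estimate the diameter from the centroid and mean radial distance."""
    pts = []
    for i in range(n_points):
        a = 2 * math.pi * i / n_points
        r = true_r + random.gauss(0, noise)      # noisy edge detection
        pts.append((r * math.cos(a), r * math.sin(a)))
    cx = sum(p[0] for p in pts) / n_points
    cy = sum(p[1] for p in pts) / n_points
    return 2 * sum(math.hypot(p[0] - cx, p[1] - cy) for p in pts) / n_points

# Repeat the "inspection" 50 times with the 3-point minimum vs. 30 points.
spread3 = statistics.stdev(measured_diameter(3) for _ in range(50))
spread30 = statistics.stdev(measured_diameter(30) for _ in range(50))
print(f"3-point spread: {spread3:.5f}  30-point spread: {spread30:.5f}")
```

The 30-point runs repeat much more tightly than the 3-point runs, which is the same effect you will see on a real part whose edges are not perfectly round.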