For glove buyers, glove performance has traditionally been measured in millimeters of thickness. What many fail to realize is that, because of how gloves are manufactured, each glove is thinnest at the wrist and thickest at the fingertips. This makes the thickness test unreliable: every glove produced varies in size, and the millimeter reading will always depend on which part of the glove is measured.
The Jump from Millimeters to Grams
Before 2010, many gloves were cheaply manufactured in China, and the quality was so poor that the gloves did not provide sufficient barrier protection, rendering them useless. The glove industry responded by measuring gloves in grams rather than millimeters. Gram weight is the more accurate statistic for the amount of material used in a glove. Now most lightweight gloves provide barrier protection for the hand, making them popular throughout the United States, and gloves are moving from being labeled “thin” and “thick” to lightweight and heavy-duty. Since digital gram scales were introduced to the glove industry, there is a clearer path for purchasing decisions, because certain glove weights are better suited to specific applications (construction, farming, package handling) than others. The gradual move from millimeters to grams will help standardize and regulate the glove industry so that end users can trust the finished product. Overall, purchasing an affordable gram scale is more cost-effective than buying a micrometer.
Example of a Glove on a Digital Gram Scale
Most industrial gloves come with a ±0.3 gram variation. If your gloves do not, you might consider purchasing some of these ISO 9001 | FDA Approved Disposable Gloves.
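To see how that ±0.3 gram tolerance works in practice, here is a minimal sketch of checking scale readings against a labeled gram weight. The nominal 4.0 g weight and the sample readings are hypothetical values for illustration, not figures from any manufacturer.

```python
# Hedged sketch: checking glove scale readings against the typical
# +/-0.3 g tolerance mentioned above. Nominal weight and readings
# are hypothetical examples.
TOLERANCE_G = 0.3

def within_tolerance(measured_g, nominal_g, tolerance_g=TOLERANCE_G):
    """Return True if a scale reading is within +/- tolerance of nominal."""
    return abs(measured_g - nominal_g) <= tolerance_g

nominal = 4.0  # hypothetical labeled gram weight for a lightweight glove
readings = [3.8, 4.1, 4.4, 3.6]

for r in readings:
    status = "OK" if within_tolerance(r, nominal) else "out of spec"
    print(f"{r:.1f} g: {status}")
```

A buyer with a digital gram scale could apply the same check by hand: any glove weighing more than 0.3 g above or below the labeled weight falls outside the usual variation.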