Fathom to Micrometer Converter
Convert fathoms to micrometers with our free online length converter.
Quick Answer
1 Fathom = 1,828,800 micrometers
Formula: Micrometers = Fathoms × 1,828,800
Use the calculator below for instant, accurate conversions.
Our Accuracy Guarantee
All conversion formulas on UnitsConverter.io have been verified against NIST (National Institute of Standards and Technology) guidelines and international SI standards. Our calculations are accurate to 10 decimal places for standard conversions and use arbitrary precision arithmetic for astronomical units.
Fathom to Micrometer Calculator
How to Use the Fathom to Micrometer Calculator:
- Enter the value you want to convert in the 'From' field (Fathom).
- The converted value in Micrometer will appear automatically in the 'To' field.
- Use the dropdown menus to select different units within the Length category.
- Click the swap button (⇌) to reverse the conversion direction.
How to Convert Fathom to Micrometer: Step-by-Step Guide
Converting Fathom to Micrometer involves multiplying the value by a specific conversion factor, as shown in the formula below.
Formula:
1 Fathom = 1,828,800 micrometers (1.8288 × 10⁶ μm)
Example Calculation:
Convert 10 fathoms: 10 × 1,828,800 = 18,288,000 micrometers
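If you prefer to script the conversion, here is a minimal Python sketch of the same formula (the function names are illustrative, not part of any published library):

```python
# 1 fathom = 6 ft × 0.3048 m/ft × 1,000,000 μm/m = 1,828,800 μm (exact)
MICROMETERS_PER_FATHOM = 1_828_800

def fathoms_to_micrometers(fathoms: float) -> float:
    """Convert a length in fathoms to micrometers."""
    return fathoms * MICROMETERS_PER_FATHOM

def micrometers_to_fathoms(micrometers: float) -> float:
    """Inverse conversion, matching the calculator's swap button."""
    return micrometers / MICROMETERS_PER_FATHOM

print(fathoms_to_micrometers(10))       # 18288000
print(micrometers_to_fathoms(914_400))  # 0.5
```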
Disclaimer: For Reference Only
These conversion results are provided for informational purposes only. While we strive for accuracy, we make no guarantees regarding the precision of these results, especially for conversions involving extremely large or small numbers which may be subject to the inherent limitations of standard computer floating-point arithmetic.
Not for professional use. Results should be verified before use in any critical application. View our Terms of Service for more information.
Need to convert to other length units?
View all Length conversions →
What is a Fathom and a Micrometer?
The Six-Foot Maritime Standard
The fathom is defined as exactly 6 feet in the imperial and U.S. customary measurement systems.
Precise equivalents:
- 6 feet (by definition)
- 2 yards (6 ft ÷ 3 ft/yd)
- 72 inches (6 ft × 12 in/ft)
- 1.8288 meters (exactly, using 1 ft = 0.3048 m)
- 182.88 centimeters
Historical basis: The arm span of an average man with arms fully outstretched, measured from fingertip to fingertip.
Arm Span Origins
Old English "fæthm":
- Primary meaning: To embrace, encircle with outstretched arms
- Secondary meaning: The distance between fingertips when arms are extended
Practical measurement: Sailors hauling in sounding lines (weighted ropes for measuring depth) would pull hand-over-hand, with each arm span representing one fathom. This created a natural counting method:
- Drop weighted line overboard
- Haul in, counting arm spans
- Number of arm spans = depth in fathoms
Standardization necessity: Since arm spans varied (5.5-6.5 feet typically), maritime commerce required a fixed standard. The British settled on exactly 6 feet, matching the standardized foot of 12 inches.
Nautical Charts and Depth Contours
Fathom lines: Nautical charts show depth contours (lines connecting points of equal depth) traditionally measured in fathoms.
Common contour intervals:
- 1, 2, 3, 5, 10 fathoms: Shallow coastal waters
- 20, 50, 100 fathoms: Coastal navigation
- 500, 1,000 fathoms: Deep ocean
Chart notation: Depths written as plain numbers on charts (e.g., "45") indicate 45 fathoms unless otherwise specified. Modern charts often include a note: "Depths in fathoms" or "Depths in meters."
Anchor Cable and Chain
Shackle: One "shackle" of anchor chain traditionally equals 15 fathoms (90 feet / 27.43 m) in the Royal Navy and many navies worldwide.
Anchoring depth rule: Ships typically anchor with a scope (ratio of chain length to water depth) of 5:1 to 7:1 for safety.
Example:
- Water depth: 10 fathoms (60 feet)
- Required chain: 50-70 fathoms (300-420 feet)
- That's 3.3 to 4.7 shackles
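To make the arithmetic explicit, the short Python sketch below reproduces the example above (the helper names are assumptions for illustration, not a navigation tool):

```python
FEET_PER_FATHOM = 6
FATHOMS_PER_SHACKLE = 15  # Royal Navy convention described above

def chain_required(depth_fathoms: float, scope: float) -> dict:
    """Chain length implied by a given scope ratio (chain length : water depth)."""
    chain_fathoms = depth_fathoms * scope
    return {
        "fathoms": chain_fathoms,
        "feet": chain_fathoms * FEET_PER_FATHOM,
        "shackles": round(chain_fathoms / FATHOMS_PER_SHACKLE, 1),
    }

print(chain_required(10, 5))  # {'fathoms': 50, 'feet': 300, 'shackles': 3.3}
print(chain_required(10, 7))  # {'fathoms': 70, 'feet': 420, 'shackles': 4.7}
```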
A micrometer is a unit of length in the metric system equal to one millionth (1/1,000,000) of a meter. The term derives from the Greek "mikros" (small) and "metron" (measure). It is abbreviated as μm, where μ (mu) is the Greek letter representing the prefix "micro-."
Note on terminology: While "micron" was widely used from 1879 to 1967, it was officially deprecated by the International System of Units (SI) in favor of "micrometer" to maintain consistent naming conventions. However, "micron" remains common in some industries, particularly semiconductor manufacturing and filtration.
The micrometer sits between the millimeter and nanometer on the metric scale:
- 1 meter = 1,000,000 micrometers
- 1 millimeter = 1,000 micrometers
- 1 micrometer = 1,000 nanometers
This scale makes micrometers perfect for measuring objects visible under optical microscopes but invisible to the naked eye.
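A few lines of Python make the ladder concrete (variable and function names are illustrative):

```python
UM_PER_M = 1_000_000   # 1 m  = 1,000,000 μm
UM_PER_MM = 1_000      # 1 mm = 1,000 μm

def meters_to_micrometers(m: float) -> float:
    return m * UM_PER_M

def millimeters_to_micrometers(mm: float) -> float:
    return mm * UM_PER_MM

print(meters_to_micrometers(1.8288))      # 1828800.0  (one fathom, for comparison)
print(millimeters_to_micrometers(0.075))  # 75.0       (roughly one human hair)
```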
Convert Micrometers to Other Units →
Note: The Fathom is part of the imperial/US customary system and today survives mainly in maritime use in the US and UK. The Micrometer belongs to the metric (SI) system.
History of the Fathom and Micrometer
Ancient Maritime Practices (Pre-9th Century)
Mediterranean and Northern European sailors: Ancient mariners measured rope and depth using body-based units:
- Cubit: Elbow to fingertip (~18 inches)
- Pace: Two steps (~5 feet)
- Arm span: Outstretched arms (~6 feet)
Sounding lead: A heavy weight (lead sinker) attached to a marked line, dropped overboard to measure depth. Sailors counted arm spans as they hauled the line back aboard.
Old English Documentation (9th-11th Centuries)
Earliest references: Anglo-Saxon texts use "fæthm" for measuring rope lengths and describing distances.
Beowulf (8th-11th century): The epic poem mentions "fæthmas" in describing ocean depths and ship measurements.
Viking influence: Old Norse "faðmr" (similar arm-span measurement) influenced English usage through Viking contact and trade.
Medieval Standardization (13th-15th Centuries)
Edward I (1272-1307): English law under Edward I began standardizing measurements, including the fathom at 6 feet.
Admiralty regulations: The emerging Royal Navy needed consistent rope, sail, and depth measurements for shipbuilding and navigation.
Rope making: British rope makers sold cordage by the fathom, with standard lengths for anchor cables (120 fathoms = 1 cable length in some contexts).
Age of Exploration (15th-17th Centuries)
Navigation charts: Early nautical charts (portolan charts) began incorporating depth soundings in fathoms.
Captain James Cook (1768-1779): Cook's Pacific voyages produced meticulous charts with fathom-based depth measurements. His charts became templates for British Admiralty standards.
Example - HMS Endeavour soundings: Cook's logs record depths like "15 fathoms, sandy bottom" or "No bottom at 100 fathoms" (indicating depths exceeding 600 feet).
British Admiralty Charts (19th Century)
Hydrographic Office (founded 1795): The British Admiralty Hydrographic Office systematized global nautical chart production, standardizing fathoms for depth.
Matthew Fontaine Maury (1806-1873): American oceanographer Maury collaborated with the British to create standardized depth charts using fathoms, mapping ocean currents and depths.
Cable-laying expeditions: Transatlantic telegraph cable projects (1850s-1860s) required precise fathom-based depth surveys. HMS Agamemnon and USS Niagara charted the Atlantic floor in fathoms before laying the 1858 cable.
U.S. Navy Adoption (19th-20th Centuries)
Inherited British standards: The U.S. Navy adopted British maritime practices, including fathom-based charts and anchor cable measurements.
U.S. Coast and Geodetic Survey: Founded in 1807 (originally "Survey of the Coast"), it produced nautical charts in fathoms for American waters.
World War II: Submarine warfare and amphibious operations relied heavily on fathom-based depth charts. USS submarines operated in waters charted in fathoms.
Metrication Movement (20th Century-Present)
International Hydrographic Organization (IHO, founded 1921): Recommended global adoption of metric system for nautical charts.
Gradual transition:
- 1970s-1980s: Most nations began publishing new charts in meters
- UK Admiralty: Converted most charts to meters by the 1990s
- U.S. NOAA: Many American charts still use fathoms, particularly for coastal waters
Mixed usage today: Modern electronic chart systems (ECDIS) allow display in either fathoms or meters, accommodating mariners accustomed to either system.
The concept of the micrometer emerged alongside the development of precision microscopy in the 17th and 18th centuries. As scientists like Robert Hooke and Antonie van Leeuwenhoek observed cells and microorganisms for the first time, they needed standardized ways to describe these microscopic dimensions.
The term "micron" (μ) was officially adopted at the First International Electrical Congress in Paris in 1879 as a convenient shorthand for one millionth of a meter. This simplified notation became widely used in scientific literature, particularly in biology, materials science, and optics.
In 1960, the International System of Units (SI) was established to create consistent naming conventions across all units. By 1967-1968, the SI officially deprecated "micron" in favor of "micrometer" to align with the systematic naming structure where prefixes like "micro-," "nano-," and "kilo-" are clearly indicated.
Despite this official change, the term "micron" persists in several industries:
- Semiconductor manufacturing: Process nodes like "5-micron technology"
- Filtration systems: "10-micron water filter"
- Materials science: Particle size specifications
- Aerospace: Surface finish requirements
The symbol μm is universally recognized in scientific and technical documentation, combining the Greek letter μ (representing the micro- prefix meaning 10⁻⁶) with m for meter.
Today, micrometers are fundamental to numerous high-precision fields, from medical diagnostics and semiconductor fabrication to quality control and environmental monitoring.
Common Uses and Applications: fathoms vs micrometers
Explore the typical applications for both Fathom (imperial/US) and Micrometer (metric) to understand their common contexts.
Common Uses for fathoms
1. Nautical Charts and Hydrography
Depth soundings: Nautical charts mark depths in fathoms, particularly on U.S. and older British charts.
Contour lines: Lines connecting equal depths (e.g., the 10-fathom line) help mariners avoid shallow areas.
Chart abbreviations:
- fms: Fathoms
- fm: Fathom
- No bottom at 100 fms: Depth exceeds 100 fathoms (600 feet)
2. Anchoring and Mooring
Anchor scope: Mariners calculate how much anchor chain to deploy based on water depth in fathoms.
Rule of thumb: Deploy 5-7 times the water depth in calm conditions, 7-10 times in storms.
Example:
- Depth: 8 fathoms
- Calm weather scope (5:1): 40 fathoms of chain
- Storm scope (10:1): 80 fathoms of chain
3. Commercial Fishing
Net depth: Fishermen describe trawl net depths in fathoms.
Example: "Running trawl at 50 fathoms" (300 feet deep)
Fishing line: Deep-sea fishing lines measured in fathoms to target specific depths.
4. Recreational Boating and Diving
Depth sounders: Many recreational boat depth finders display fathoms (though meters and feet are increasingly common).
Dive planning: Divers reference depth in fathoms on nautical charts when planning dive sites.
5. Submarine Operations
Periscope depth: Submarines traditionally use fathoms for depth control.
Example: "Dive to 20 fathoms" (120 feet)
Historical note: WWII submarine logs recorded depths in fathoms; modern submarines use meters.
6. Maritime Literature and Tradition
Nautical expressions:
- "To fathom something" = to understand its depth (metaphorically)
- "Unfathomable" = too deep to measure or comprehend
Sailing instructions: Traditional pilot books use fathoms for approach depths and anchorage recommendations.
When to Use micrometers
1. Microscopy and Biology
Micrometers are the standard unit for measuring cells, bacteria, and other microorganisms under optical microscopes. Lab technicians and researchers use calibrated eyepiece scales marked in micrometers to measure biological specimens. Cell biology, microbiology, and histology all depend on micrometer measurements for specimen identification and analysis.
2. Semiconductor Manufacturing
The semiconductor industry uses micrometers (often called "microns") to specify process node sizes, though modern chips have moved to nanometer scales. Wafer thickness (typically 775 μm for 300 mm wafers), photoresist layers, and older chip features are measured in micrometers. Quality control requires precise measurements to ensure manufacturing tolerances.
3. Precision Engineering
Manufacturing engineers specify tolerances in micrometers for high-precision components. CNC machining, grinding, and polishing operations achieve accuracies of ±1-10 μm. Measuring instruments like micrometers (the tool) can measure to 0.001 mm = 1 μm precision. Critical aerospace, medical device, and automotive components require micrometer-level quality control.
4. Fiber Optics and Telecommunications
Fiber optic cables have core diameters measured in micrometers: single-mode fibers typically use 8-10 μm cores, while multi-mode fibers range from 50-62.5 μm. The precise core diameter determines light transmission characteristics, bandwidth, and distance capabilities. Telecom technicians reference these specifications when installing and troubleshooting fiber networks.
5. Filtration and Air Quality
Filter manufacturers rate products by the size of particles they capture, measured in micrometers. HEPA filters capture 99.97% of particles ≥0.3 μm. Water filters, air purifiers, and industrial filtration systems all use micrometer ratings. Environmental agencies track PM2.5 (particulate matter <2.5 μm) and PM10 pollution, which pose respiratory health risks.
6. Medical Diagnostics
Medical laboratories measure blood cells in micrometers: red blood cells average 6-8 μm, while variations may indicate conditions like anemia. Pathologists examine tissue samples and tumor margins at micrometer scale. Medical device manufacturing (catheters, needles, implants) requires micrometer-precision specifications for safety and efficacy.
Convert Medical Measurements →
7. Surface Finish and Coatings
Surface roughness is measured in micrometers using parameters like Ra (average roughness). A mirror finish might be <0.1 μm Ra, while machined surfaces range from 0.8-25 μm Ra. Coating thickness—paint, anodizing, plating—is specified in micrometers to ensure corrosion protection and aesthetic quality.
Additional Unit Information
About Fathom (fath)
How many feet are in a fathom?
Exactly 6 feet = 1 fathom.
This is the defining relationship. The fathom was standardized to 6 feet during medieval English measurement standardization.
How many meters are in a fathom?
1 fathom = 1.8288 meters (exactly).
This conversion uses the international foot definition: 1 foot = 0.3048 meters (exactly).
Calculation: 6 feet × 0.3048 m/ft = 1.8288 m
Is the fathom an SI unit?
No, the fathom is not an SI unit.
It belongs to the imperial and U.S. customary systems. The SI unit of length is the meter.
International usage: The International Hydrographic Organization recommends meters for nautical charts, but fathoms remain legal and common in U.S. and some British waters.
Is the fathom still commonly used today?
Yes, in specific maritime contexts, especially in the United States.
Still common:
- U.S. NOAA nautical charts (many coastal charts)
- Recreational boating in the U.S.
- Commercial fishing fleets
- Maritime tradition and literature
Declining usage:
- International shipping (uses meters)
- Most modern navies (switched to meters)
- New chart production (increasingly metric)
Result: Fathoms persist in American waters and traditional maritime communities but are gradually being replaced by meters in international contexts.
Where does the word "fathom" come from?
From Old English "fæthm" (outstretched arms, embrace).
Etymology:
- Proto-Germanic: *faþmaz (embrace, armful)
- Old English: fæthm (span of outstretched arms)
- Middle English: fadme, fathme
- Modern English: fathom
Original meaning: The distance between fingertips when a person extends both arms horizontally—roughly 6 feet for an average man.
Verb form: "To fathom" originally meant "to measure depth with outstretched arms," later metaphorically "to comprehend deeply" (exploring the depths of understanding).
Why are anchor chains measured in shackles, not fathoms?
Both are used, but shackles are standard for large vessels.
Shackle definition: 1 shackle = 15 fathoms = 90 feet = 27.43 meters
Reason: Anchor chains are physically connected with shackle links every 15 fathoms. These physical shackles allow disconnection for maintenance and provide visual/tactile markers when deploying chain.
Usage:
- Small vessels: Anchor chain length in fathoms
- Large vessels and navies: Anchor chain length in shackles
Example: "Deploy 5 shackles" = 75 fathoms = 450 feet of chain
How deep is "full fathom five"?
5 fathoms = 30 feet = 9.144 meters.
Shakespeare's The Tempest: Ariel's song describes a drowned man lying at the bottom, 5 fathoms below the surface.
Context: 30 feet is deep enough that:
- Surface light barely reaches the body
- Free diving without equipment is challenging
- The body would be difficult to recover without specialized equipment
This depth creates the eerie, unreachable quality of Ariel's description.
Can I convert my depth sounder from fathoms to meters?
Yes, most modern depth sounders (fishfinders, chartplotters) allow unit selection.
Typical options:
- Feet
- Fathoms
- Meters
How to change (general steps):
- Access settings menu
- Find "Units" or "Depth Units"
- Select preferred unit (fathoms, feet, or meters)
- Save settings
Check manual: Specific instructions vary by manufacturer (Garmin, Lowrance, Raymarine, Furuno, etc.).
What's the difference between fathoms and cable lengths?
Both are nautical length units, but they measure different things:
Fathom:
- 6 feet / 1.8288 meters
- Primarily for depth measurement
Cable length:
- UK: 608 feet = 185.3 meters (1/10 nautical mile)
- US (historical): 720 feet = 219.5 meters (120 fathoms)
- Primarily for horizontal distance (anchor cable, ship-to-ship spacing)
Confusion: The term "cable" sometimes referred to 100 or 120 fathoms of anchor cable, but the standardized "cable length" unit differs from this.
Do submarines still use fathoms?
Historically yes, but modern submarines use meters.
World War II era: U.S. and British submarines recorded depths in fathoms (e.g., "Dive to 50 fathoms").
Modern practice:
- U.S. Navy: Switched to feet and meters for submarine operations
- International: Nearly all modern navies use meters
Reason for change: International standardization, digital instrumentation, and NATO interoperability drove metrication.
About Micrometer (μm)
Is a micrometer the same as a micron?
Yes, micrometer and micron refer to the same unit: one millionth of a meter (1×10⁻⁶ m or 1 μm). The term "micron" (symbol: μ) was officially used from 1879 to 1967 but was deprecated by the International System of Units (SI) in favor of "micrometer" to maintain consistent naming conventions.
Despite being officially deprecated, "micron" remains common in several industries:
- Semiconductor manufacturing ("5-micron process")
- Filtration ("10-micron filter")
- Materials science (particle size specifications)
In scientific and technical writing, "micrometer" (μm) is the preferred term, but both are universally understood.
How many micrometers are in a millimeter?
There are 1,000 micrometers (μm) in 1 millimeter (mm). This makes sense when you consider the metric prefixes:
- "Milli-" means one thousandth (1/1,000)
- "Micro-" means one millionth (1/1,000,000)
Since a micrometer is 1,000 times smaller than a millimeter, dividing 1 mm into 1,000 equal parts gives you 1 μm per part.
Examples:
- 0.5 mm = 500 μm
- 0.1 mm = 100 μm
- 0.075 mm = 75 μm (typical human hair)
Convert Millimeters to Micrometers →
What are some examples of things measured in micrometers?
Biological:
- Bacteria: 1-10 μm (E. coli ≈ 2 μm)
- Red blood cells: 6-8 μm
- Human hair diameter: 50-100 μm
- Pollen grains: 10-100 μm
Technology:
- Fiber optic core: 8-62.5 μm (depending on type)
- Semiconductor features: 0.01-10 μm (older process nodes; modern features are measured in nanometers)
- Surface roughness: 0.1-25 μm (machining)
Materials:
- Paint thickness: 25-100 μm
- Plastic wrap: 10-15 μm
- Paper thickness: 70-100 μm
Essentially, anything visible under an optical microscope but invisible to the naked eye is measured in micrometers.
How do I convert micrometers to inches?
To convert micrometers to inches, multiply by 0.00003937 (or divide by 25,400).
Formula: inches = micrometers × 0.00003937
Examples:
- 100 μm × 0.00003937 = 0.003937 inches (≈ 0.004")
- 1,000 μm × 0.00003937 = 0.03937 inches (≈ 0.04")
- 2,540 μm × 0.00003937 = 0.1 inches
For context, 1 inch = 25,400 μm (or 25.4 mm), so micrometers are extremely small when expressed in imperial units.
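A short Python sketch of the same conversion (note: 0.00003937 is a rounded factor; dividing by the exact 25,400 is preferable):

```python
UM_PER_INCH = 25_400  # exact: 1 in = 25.4 mm = 25,400 μm

def micrometers_to_inches(um: float) -> float:
    return um / UM_PER_INCH

print(micrometers_to_inches(100))    # ≈ 0.003937 in
print(micrometers_to_inches(1_000))  # ≈ 0.03937 in
print(micrometers_to_inches(2_540))  # 0.1 in
```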
Convert Micrometers to Inches →
Can the human eye see micrometers?
The human eye's resolution limit is approximately 50-100 micrometers under ideal conditions. This means:
Barely visible (with perfect vision):
- Thick human hair: 100 μm
- Fine sand grains: 100-500 μm
- Large dust particles: 100+ μm
Invisible without magnification:
- Bacteria: 1-10 μm
- Red blood cells: 6-8 μm
- Fine dust: <50 μm
- Most microorganisms: <50 μm
To see objects smaller than ~50 μm, you need a microscope. Optical microscopes can resolve features down to about 0.2 μm (200 nm), while electron microscopes can see structures at the nanometer scale.
What is the difference between micrometer and nanometer?
A micrometer (μm) equals one millionth of a meter (10⁻⁶ m), while a nanometer (nm) equals one billionth of a meter (10⁻⁹ m). This means 1 micrometer = 1,000 nanometers.
Scale comparison:
- Micrometer scale: bacteria, cells, human hair (1-100 μm)
- Nanometer scale: viruses, molecules, atoms (1-100 nm)
Examples:
- Red blood cell: 7,000 nm = 7 μm
- Coronavirus particle: 100 nm = 0.1 μm
- DNA helix width: 2 nm = 0.002 μm
- Silicon atom: 0.2 nm = 0.0002 μm
Optical microscopes work at the micrometer scale, while electron microscopes are needed for nanometer-scale imaging.
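The factor of 1,000 makes the conversion trivial to script; here is a minimal Python example (illustrative names):

```python
NM_PER_UM = 1_000  # 1 μm = 1,000 nm

def micrometers_to_nanometers(um: float) -> float:
    return um * NM_PER_UM

def nanometers_to_micrometers(nm: float) -> float:
    return nm / NM_PER_UM

print(micrometers_to_nanometers(7))    # 7000  (a red blood cell, ~7 μm)
print(nanometers_to_micrometers(100))  # 0.1   (a coronavirus particle, ~100 nm)
```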
Convert Micrometers to Nanometers →
How accurate are micrometer measuring tools?
A micrometer (the measuring instrument, also called a "mike") typically measures with an accuracy of ±0.001 mm (±1 μm) for standard models, and ±0.0001 mm (±0.1 μm) for digital precision models.
Types and accuracy:
- Standard mechanical: ±0.001 mm (±1 μm)
- Vernier micrometer: ±0.001 mm (±1 μm)
- Digital micrometer: ±0.0005-0.001 mm (±0.5-1 μm)
- High-precision digital: ±0.0001 mm (±0.1 μm)
Accuracy depends on:
- Tool quality and calibration
- Temperature (thermal expansion affects readings)
- Operator technique (proper force and reading)
- Workpiece surface condition
For even higher precision, coordinate measuring machines (CMMs) and optical comparators can achieve sub-micrometer accuracy in controlled environments.
Why was "micron" deprecated?
The International System of Units (SI) deprecated "micron" in 1967-1968 to maintain consistent naming conventions across all metric units. The SI system uses standard prefixes (micro-, nano-, kilo-, etc.) combined with base units (meter, gram, second) to create derived units.
Reasons for change:
- Consistency: "Micrometer" follows the pattern of millimeter, nanometer, kilometer
- Clarity: Combines "micro-" (10⁻⁶) with "meter" to clearly indicate the scale
- International standardization: Reduces confusion in scientific communication
- Symbol standardization: μm is unambiguous, while μ alone could be confused with other uses
Why "micron" persists:
- Shorter and easier to say ("micron" vs "micrometer")
- Decades of industry usage before 1967
- Well-established in semiconductor, filtration, and materials industries
- No confusion in context (everyone knows what "10-micron filter" means)
In formal scientific writing, use "micrometer (μm)" for SI compliance.
What equipment measures in micrometers?
Precision measuring instruments:
- Micrometer caliper (the tool): Measures dimensions to ±1 μm accuracy
- Dial indicator: Measures displacement to ±1-5 μm
- Coordinate Measuring Machine (CMM): Sub-micrometer accuracy
- Optical comparator: Projects magnified image for micrometer-scale inspection
- Laser interferometer: Measures to nanometer/sub-micrometer accuracy
Microscopy equipment:
- Optical microscope: With calibrated eyepiece scales (reticles) marked in micrometers
- Confocal microscope: 3D imaging with micrometer resolution
- Scanning Electron Microscope (SEM): Nanometer resolution but calibrated in micrometers
Surface analysis:
- Surface roughness tester (profilometer): Measures Ra, Rz in micrometers
- Thickness gauge: Coating thickness to ±1 μm
- Film thickness measurement: Non-contact optical methods
Quality control:
- Particle size analyzers: Measure suspended particles in micrometers
- Laser diffraction instruments: Characterize powders and emulsions
How is micrometer used in air quality standards?
Air quality standards use micrometers to classify particulate matter (PM) by size, which determines health impacts:
PM10 (Particulate Matter <10 μm):
- Includes dust, pollen, mold
- Can reach lungs but often trapped in nose/throat
- EPA 24-hour standard: 150 μg/m³
PM2.5 (Particulate Matter <2.5 μm):
- Includes combustion particles, smoke, fine dust
- Small enough to enter deep into lungs and bloodstream
- EPA 24-hour standard: 35 μg/m³
- More dangerous than PM10 due to deep lung penetration
Why size matters:
- >10 μm: Trapped in nose and throat
- 2.5-10 μm: Can reach upper respiratory tract and lungs
- <2.5 μm: Can penetrate deep into lungs and enter bloodstream
- <0.1 μm (ultrafine): Can cross into organs and brain
Filter effectiveness:
- HEPA filters: Capture 99.97% of particles ≥0.3 μm
- N95 masks: Filter 95% of particles ≥0.3 μm
- Standard HVAC filters: Typically 3-10 μm particle capture
Understanding micrometer-scale particle sizes is critical for respiratory health, especially for vulnerable populations.
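The size bands above can be expressed as a simple classifier. This is a sketch using the thresholds quoted in this section, not a regulatory definition:

```python
def particle_category(diameter_um: float) -> str:
    """Classify an airborne particle by diameter in micrometers."""
    if diameter_um < 0.1:
        return "ultrafine (<0.1 μm): can cross into organs and the brain"
    if diameter_um < 2.5:
        return "PM2.5 (<2.5 μm): penetrates deep into the lungs"
    if diameter_um <= 10:
        return "PM10 (2.5-10 μm): reaches the upper respiratory tract and lungs"
    return ">10 μm: mostly trapped in the nose and throat"

print(particle_category(0.3))  # PM2.5 band (the size HEPA filters are rated at)
print(particle_category(7.5))  # PM10 band
print(particle_category(50))   # >10 μm
```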
Convert Air Quality Measurements →
Conversion Table: Fathom to Micrometer
| Fathom (fath) | Micrometer (μm) |
|---|---|
| 0.5 | 914,400 |
| 1 | 1,828,800 |
| 1.5 | 2,743,200 |
| 2 | 3,657,600 |
| 5 | 9,144,000 |
| 10 | 18,288,000 |
| 25 | 45,720,000 |
| 50 | 91,440,000 |
| 100 | 182,880,000 |
| 250 | 457,200,000 |
| 500 | 914,400,000 |
| 1,000 | 1,828,800,000 |
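For readers who want to check or extend the table, a short Python snippet (a sketch, not the site's own code) reproduces the rows above:

```python
MICROMETERS_PER_FATHOM = 1_828_800  # exact factor used throughout this page

for fathoms in [0.5, 1, 1.5, 2, 5, 10, 25, 50, 100, 250, 500, 1000]:
    micrometers = fathoms * MICROMETERS_PER_FATHOM
    print(f"{fathoms:>6} fathom(s) = {micrometers:>13,.0f} μm")
```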
People Also Ask
How do I convert Fathom to Micrometer?
To convert Fathom to Micrometer, enter the value in Fathom in the calculator above. The conversion will happen automatically. Use our free online converter for instant and accurate results. You can also visit our length converter page to convert between other units in this category.
Learn more →
What is the conversion factor from Fathom to Micrometer?
The conversion factor from Fathom to Micrometer is 1,828,800: multiply a value in fathoms by 1,828,800 to get micrometers (1 fathom = 6 ft × 0.3048 m/ft = 1.8288 m = 1,828,800 μm). Our calculator handles all calculations automatically. See the conversion table above for common values.
Can I convert Micrometer back to Fathom?
Yes! You can easily convert Micrometer back to Fathom by using the swap button (⇌) in the calculator above, or by visiting our Micrometer to Fathom converter page. You can also explore other length conversions on our category page.
Learn more →
What are common uses for Fathom and Micrometer?
Fathom and Micrometer are both length units, but they serve very different contexts: fathoms are used mainly for water depth in maritime settings (nautical charts, anchoring, fishing), while micrometers measure microscopic dimensions in science, manufacturing, and medicine. Browse our length converter for more conversion options.
For more length conversion questions, visit our FAQ page or explore our conversion guides.
Helpful Conversion Guides
Learn more about unit conversion with our comprehensive guides:
📚 How to Convert Units
Step-by-step guide to unit conversion with practical examples.
🔢 Conversion Formulas
Essential formulas for length and other conversions.
⚖️ Metric vs Imperial
Understand the differences between measurement systems.
⚠️ Common Mistakes
Learn about frequent errors and how to avoid them.
All Length Conversions
Other Length Units and Conversions
Explore other length units and their conversion options:
- Meter (m) • Fathom to Meter
- Kilometer (km) • Fathom to Kilometer
- Hectometer (hm) • Fathom to Hectometer
- Decimeter (dm) • Fathom to Decimeter
- Centimeter (cm) • Fathom to Centimeter
- Millimeter (mm) • Fathom to Millimeter
- Inch (in) • Fathom to Inch
- Foot (ft) • Fathom to Foot
- Yard (yd) • Fathom to Yard
- Mile (mi) • Fathom to Mile
Verified Against Authority Standards
All conversion formulas have been verified against international standards and authoritative sources to ensure maximum accuracy and reliability.
National Institute of Standards and Technology — Official US standards for length measurements
Bureau International des Poids et Mesures — International System of Units official documentation
Last verified: December 3, 2025