One of the many amazing aspects of today's fiber-optic connectors is the mechanical precision that allows low-loss mated pairs, especially considering that single-mode fiber has an optical core only about 9 microns in diameter. Two of these cores, from opposing fibers, must be aligned with near-perfect mechanical precision to minimize optical loss.
Positive contact and angled polish keep losses and reflections to a minimum. Of course, this small core size means that the slightest speck of dust on the single-mode core can introduce disastrously high levels of loss, so absolute cleanliness (cleaning the end faces of the connector ferrules) is a must every time a connection is made.
It was not always this good. Quite a few years ago, optical fiber networking was becoming mainstream in the LAN arena. At that time, 50/125, 100/140, and later 62.5/125 multimode fibers (core/cladding diameters in microns) were the short-distance LAN choices. The large-diameter core allowed optical launch using LEDs rather than more costly and finicky lasers.
The 906 SMA optical connector was one of the first popular optical connectors, but it had a few annoyances. For example, a Delrin sleeve (a hollow plastic cylinder) was required to align the ferrules of a mated pair precisely enough to keep the loss under 3 dB. Being detachable, these Delrin sleeves had a habit of falling to the floor and rolling under anything nearby, never to be found again.
Compared to today's connectors, with losses of a fraction of a dB, that 3 dB loss spec was atrocious. But it was a fact of life, and we had to live with it.
One day, our lab technician was setting up the optical portion of our corporate network and ran into difficulty. He came to me for help because he was getting unexplained bit errors over all the optical links. We had just started manufacturing an Ethernet 10Base-T and FOIRL (Fiber Optic Inter Repeater Link) hub, but he was trying to use Cabletron optical transceivers at the desktop ends. Plus, he had set up an optical patch panel that added an extra mated pair per patch cord.
I had designed the optical cards for our own product. In spite of much complaining from production, I insisted on tweaks to ensure that the green "Link Good" indicator LED would not turn on unless the incoming optical signal level was sufficient to ensure a bit error rate of better than 10⁻⁹. This was a requirement of the FOIRL Specification.
The Cabletron folks did not want expensive tweaks, so they designed their gear with a "Link Good" indicator LED that turned on with the slightest optical input -- never mind that it was so weak that the bit error rate made the link unusable -- thus, their gear was not FOIRL compliant.
Our lab technician was fooled by the Cabletron desktop transceivers' "Link Good" LEDs. I showed him, with my optical power meter, that the link was not good, and that the Cabletron LEDs were lying to him. Then I told him the only solution was to rearrange his patch panel to eliminate one of the two optical couplings: connect the cord through a single splice bushing rather than running splice bushing to patch cord to another splice bushing. This eliminated 2 dB to 3 dB of loss, and the links worked.
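The arithmetic behind that fix is a simple link budget: launch power, minus every connector and fiber loss, must still clear the receiver's sensitivity. A minimal sketch in Python, with all power levels and losses as illustrative assumptions (not measured values from this story), shows how dropping one 3 dB mated pair changes the margin:

```python
# Hypothetical link-budget sketch; every number below is an illustrative
# assumption, not a measurement from the actual network described.

def link_margin_db(launch_dbm, sensitivity_dbm, connector_losses_db,
                   fiber_km, atten_db_per_km):
    """Receiver margin in dB: launch power minus all losses minus sensitivity."""
    total_loss_db = sum(connector_losses_db) + fiber_km * atten_db_per_km
    return launch_dbm - total_loss_db - sensitivity_dbm

# Patch panel with two mated SMA pairs (3 dB each, the era's spec) vs. one:
two_pairs = link_margin_db(-12.0, -27.0, [3.0, 3.0], fiber_km=0.5,
                           atten_db_per_km=3.5)
one_pair = link_margin_db(-12.0, -27.0, [3.0], fiber_km=0.5,
                          atten_db_per_km=3.5)
print(two_pairs, one_pair)  # the single-pair layout gains exactly 3 dB
```

With marginal receivers, those few dB are the difference between a clean link and the unexplained bit errors the technician was chasing.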
Note: 3 dB of optical loss is equivalent to 6 dB of electrical loss. A photodiode's current is proportional to incident optical power, so halving the optical power halves the photocurrent -- and since electrical power scales with the square of current, the electrical signal drops by 6 dB.
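That square-law relationship can be checked in a couple of lines of Python (a sketch of the math, nothing more):

```python
import math

# Halve the optical power reaching the photodiode.
optical_ratio = 0.5
optical_db = 10 * math.log10(optical_ratio)  # about -3.01 dB optical

# Photocurrent is proportional to optical power, and electrical power
# goes as current squared, so the electrical ratio is the square.
electrical_db = 10 * math.log10(optical_ratio ** 2)  # about -6.02 dB electrical
print(optical_db, electrical_db)
```

In other words, every optical dB costs two electrical dB at the receiver, which is why a "mere" 3 dB connector spec hurt as much as it did.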