Lab-grown diamonds, also known as synthetic, cultivated, or man-made diamonds, have their roots in the mid-20th century. The first commercially successful attempts to create diamonds in a laboratory setting began in the 1950s. In 1954, the General Electric Company, through a research project led by H. Tracy Hall, developed the first reproducible process for growing diamonds in the lab. The process used a high-pressure, high-temperature (HPHT) method, which mimicked the natural conditions under which diamonds form deep within the Earth.
Although the lab-grown diamonds produced during this period were used mainly in industrial applications such as cutting and grinding tools, the process marked the beginning of a significant technological shift. It opened up the possibility of creating high-quality, gem-grade diamonds, a goal that would be realised decades later.
In the 1980s, further advances were made in the production of synthetic diamonds, most notably the development of the chemical vapour deposition (CVD) method, a process that breaks down carbon-containing gas molecules and deposits the carbon atoms into a diamond crystal structure. CVD allows finer control over a diamond's properties, making it better suited to producing gem-quality stones.
Despite these technical advances, production of lab-grown diamonds for the jewellery industry did not take off until the 21st century. Such diamonds are virtually indistinguishable from their natural counterparts in physical properties and appearance. Today, lab-grown diamonds account for a significant portion of the market, and their popularity continues to grow thanks to their ethical, environmental, and cost advantages over mined diamonds.