For centuries charcoal was one of three resources that went into making iron. The other two were iron ore and a flux, usually limestone. Of the three, charcoal was the only renewable resource, but it was also the most expensive to acquire. It could not simply be mined; it had to be created through a time-consuming, delicate process. An immense amount of charcoal was required to keep a furnace the size of Hopewell's running. When it was "in blast," the furnace could consume as much as 800 bushels of charcoal per day.
Using charcoal to make iron was a process that came to America from England. By the 1770s, when Hopewell Furnace began making iron, charcoal was the only fuel available. The 19th century, however, brought experiments with anthracite, or "hard," coal in place of charcoal. In 1837 a Pennsylvania furnace succeeded in producing iron with anthracite coal by using a hot-air blast. Though iron production with anthracite coal was briefly tried at Hopewell during the 1850s, it did not prove economically viable because of the added expense of hauling the coal to the furnace. Hopewell continued to operate as a charcoal furnace for over four decades after most of the iron industry shifted from charcoal to anthracite coal, but in 1883 Hopewell Furnace "blew out" for the final time, having lost its business to the newer anthracite-fired furnaces.