Weller RT tips are beautiful pieces of engineering. Some people have gone to the effort of tearing them apart. They embed, in a small form factor, both a heating element and a thermocouple, exposed on just three contacts of a 3.5mm jack connector.
A thermocouple makes use of the Seebeck effect: two conductors of dissimilar materials, joined at one end only, produce at the open ends a voltage approximately proportional to the temperature difference along their length. Thermocouples are classified into types according to the pair of materials used.
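As a rough illustration of that relationship, here is a minimal sketch that converts a thermocouple voltage into a temperature difference using a single Seebeck coefficient. The 41 µV/°C figure is the commonly quoted approximation for type K near room temperature; real thermocouples are only approximately linear, which is exactly why calibration matters later on.

```python
# Illustrative only: a purely linear thermocouple model.
# Real thermocouple curves deviate from this, especially over wide ranges.

SEEBECK_K_UV_PER_C = 41.0  # approximate type-K Seebeck coefficient, in µV/°C

def delta_t_from_voltage(v_uv: float) -> float:
    """Temperature difference (°C) between the hot junction and the open ends."""
    return v_uv / SEEBECK_K_UV_PER_C

# Under this approximation, a 12.3 mV reading corresponds to a 300 °C
# difference between the tip and the cold junction.
print(round(delta_t_from_voltage(12300.0)))
```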
That being said, I would like to talk about temperature accuracy. This is probably the part I spent more time on than I expected, to be honest. The reason is quite simple: there is just not much information available on what type of thermocouple is inside a Weller RT tip.
I found contradictory suggestions that the thermocouple inside could be type D or the more common type K. But even if I knew the thermocouple type, I would still need a way to verify the measurements. So I thought for a while: how do you calibrate something that will be used at over 300°C?
My initial idea was to use the melting and boiling points of water, setting aside variations in atmospheric pressure, and hoping that the thermocouple response would be linear enough for a two-point calibration at 0°C and 100°C to remain applicable all the way up to 400°C.
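The two-point idea boils down to fitting a line through two known (raw, true) pairs and extrapolating it. A minimal sketch, with made-up raw readings for illustration:

```python
# Two-point linear calibration: true ≈ a * raw + b, fitted from readings
# taken at two reference temperatures and then extrapolated upward.
# The raw readings below (2.0 °C in ice water, 97.5 °C in boiling water)
# are hypothetical values, not actual measurements.

def two_point_calibration(raw_lo, true_lo, raw_hi, true_hi):
    """Return (a, b) for the correction true = a * raw + b."""
    a = (true_hi - true_lo) / (raw_hi - raw_lo)
    b = true_lo - a * raw_lo
    return a, b

a, b = two_point_calibration(2.0, 0.0, 97.5, 100.0)
corrected = a * 350.0 + b  # extrapolating the correction to a raw 350 °C reading
```

The obvious risk, and the reason I kept looking, is that nothing guarantees the sensor stays linear across the 250°C of extrapolation between the calibration points and soldering temperatures.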
Then I stumbled upon the Hakko FG-100, a calibration device specifically designed for soldering irons, which advertises a ±3°C tolerance. I made a few measurements and determined a linear law to adjust the readings of the Maxim MAX31855 type-K thermocouple-to-digital converter currently used.
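For context, here is a sketch of what decoding a MAX31855 reading and applying such a linear correction looks like. The frame layout follows the MAX31855 datasheet (bits 31:18 carry the signed 14-bit thermocouple temperature in 0.25°C steps, bit 16 flags a fault); the correction coefficients below are placeholders, not the values I actually fitted against the FG-100.

```python
# Decode a raw 32-bit MAX31855 SPI frame and apply a linear correction.
# CAL_GAIN and CAL_OFFSET are hypothetical placeholders for the fitted law.

def decode_max31855(frame: int) -> float:
    """Return the thermocouple temperature in °C from a raw 32-bit frame."""
    if frame & 0x10000:            # bit 16: fault (open/short circuit)
        raise RuntimeError("MAX31855 reports a fault")
    raw = frame >> 18              # 14-bit signed value, 0.25 °C per LSB
    if raw & 0x2000:               # sign-extend negative temperatures
        raw -= 0x4000
    return raw * 0.25

CAL_GAIN, CAL_OFFSET = 1.02, -3.5  # placeholder calibration coefficients

def corrected_temperature(frame: int) -> float:
    return CAL_GAIN * decode_max31855(frame) + CAL_OFFSET

# Example: a raw count of 400 decodes to 100.0 °C before correction.
frame = 400 << 18
```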
Here is a video showing the final result. Note that the device was powered from a 12 W supply, so the rise time would be slow enough to observe the accuracy over the whole range.