Ask Electronics, 19 Jun 2023

Hello, we are working on a thesis in which we use MOSFETs as an alternative radiation detector. To explain how it works: when the MOSFET is irradiated by an external radiation source, its threshold voltage increases, and that shift is what we will use to determine the radiation dose. I'm currently asking for help on how to measure the threshold voltage. We are using an n-channel MOSFET (model: IRFP250NPbF).

The manufacturer's datasheet lists VGS(th) (gate threshold voltage) as min 2.0 V, max 4.0 V, with the test conditions VDS = VGS, ID = 250 µA. Does this mean that to measure VGS(th) we first need to satisfy those conditions? And across which terminals do we measure VGS(th): drain to source, or gate to source?

I would also add that we have already tried supplying a voltage at the gate with respect to the source terminal. Using a 4 V supply, when we measured VDS (the drain-to-source voltage) there was a voltage drop, and we got 3.5 V. We are using an Arduino to measure the voltage, with a multimeter for cross-checking. Feel free to share your thoughts.
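For reference, the Arduino side of our measurement is nothing fancy, roughly the sketch below; the pin choice and the 5 V ADC reference are just how our setup happens to be wired, so treat them as placeholders:

```cpp
// Rough sketch of our voltage readout on an Arduino Uno (5 V ADC reference).
// A0 is wired to the node under test (gate-source or drain-source); the pin
// choice is arbitrary.
const int MEASURE_PIN = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(MEASURE_PIN);    // 0..1023 counts
  float volts = raw * (5.0 / 1023.0);   // counts -> volts
  Serial.println(volts, 3);
  delay(500);
}
```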

[–] [email protected] 1 points 1 year ago

Hey, I messaged them a bit. This is an undergraduate project with no budget (so a dedicated MOSFET tester is out of reach). I also suggested a sort of sweep method using an MCU and some op-amp glue, but I don't think they have sufficient background to get that kind of thing working yet (in fact, I barely do, so it probably won't 'just work' with whatever I came up with off the top of my head).
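For the record, the sweep I had in mind was roughly the sketch below: a PWM pin smoothed by an RC filter drives the gate, a low-side sense resistor turns drain current into a voltage the ADC can read, and the first Vgs where the current crosses the datasheet's 250 µA test level is taken as the threshold. Every pin and part value here is invented, and the PWM-plus-RC gate drive is a stand-in for the op-amp glue, so treat it as a starting point, not a tested design:

```cpp
// Untested Vth sweep sketch. Assumed wiring (all values invented):
//   D9 (PWM) -> 10k resistor -> gate, 1 uF from gate to GND (smooths PWM to DC)
//   Drain -> +5 V; source -> 1k sense resistor -> GND
//   A0 reads the gate node, A1 reads the source (sense) node
// The 1k sense resistor makes 250 uA readable (0.25 V) and also limits the
// current once the FET turns on harder. Note the datasheet measures Vgs(th)
// with gate tied to drain (VDS = VGS); this rig only approximates that.
const int   GATE_PWM  = 9;
const int   GATE_ADC  = A0;
const int   SENSE_ADC = A1;
const float R_SENSE   = 1000.0;  // ohms
const float I_TH      = 250e-6;  // amps; datasheet Vgs(th) test current

float readVolts(int pin) {
  return analogRead(pin) * (5.0 / 1023.0);
}

void setup() {
  Serial.begin(9600);
  pinMode(GATE_PWM, OUTPUT);
}

void loop() {
  // Ramp the filtered gate drive from 0 toward 5 V in small steps.
  for (int duty = 0; duty <= 255; duty++) {
    analogWrite(GATE_PWM, duty);
    delay(100);                          // let the RC filter settle
    float vSense = readVolts(SENSE_ADC);
    if (vSense / R_SENSE >= I_TH) {      // drain current crossed 250 uA
      // The sense resistor lifts the source, so subtract it from the gate voltage.
      float vgs = readVolts(GATE_ADC) - vSense;
      Serial.print("Vth estimate: ");
      Serial.println(vgs, 3);
      break;
    }
  }
  analogWrite(GATE_PWM, 0);              // FET off between sweeps
  delay(5000);
}
```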

What I was thinking is that perhaps they can set Vds and Vgs to fixed values such that a particular MOSFET conducts a fixed current, e.g. 100 mA, somewhere near-ish the start of the linear region. Then record the Vgs required to achieve this current for each of a set of MOSFETs, say a few dozen (to account for part variation).

Then, after exposing them to varying amounts of radiation (a few devices per exposure level), put them back into the same test conditions and measure how the output current has changed and what Vgs restores the same current. Draw some graphs, discuss the advantages and disadvantages relative to the Vth method with regard to radiation dosimetry, conclude, and call it a day.
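If they did want to automate the bookkeeping with the Arduino they already have (strictly optional; a pot and a multimeter would do the same job), the before/after measurement could look something like the sketch below. Same invented PWM-plus-RC gate drive as above, but with a 10 Ω sense resistor so 100 mA drops a full 1 V for the ADC:

```cpp
// Untested fixed-current test point. Assumed wiring (values invented):
//   D9 (PWM) -> RC filter -> gate; drain -> +5 V; source -> 10 ohm -> GND
//   A0 reads the gate node, A1 reads the sense resistor.
// Caveats: at 100 mA the FET dissipates a few hundred mW, so keep the on-time
// short (Vth drifts with die temperature), and with a 5 V Arduino the usable
// Vgs tops out near 4 V, which may not be enough for the highest-Vth parts.
const int   GATE_PWM  = 9;
const int   GATE_ADC  = A0;
const int   SENSE_ADC = A1;
const float R_SENSE   = 10.0;   // ohms
const float I_TARGET  = 0.100;  // amps; the fixed test current

float readVolts(int pin) {
  return analogRead(pin) * (5.0 / 1023.0);
}

void setup() {
  Serial.begin(9600);
  pinMode(GATE_PWM, OUTPUT);

  // Walk the gate drive up until the drain current reaches the target, then
  // log the Vgs that produced it. Run once per device, before and after
  // irradiation; the shift in the logged Vgs is the quantity of interest.
  bool reached = false;
  for (int duty = 0; duty <= 255 && !reached; duty++) {
    analogWrite(GATE_PWM, duty);
    delay(100);                          // RC settle time
    float vSense = readVolts(SENSE_ADC);
    if (vSense / R_SENSE >= I_TARGET) {
      float vgs = readVolts(GATE_ADC) - vSense;
      Serial.print("Vgs @ 100 mA: ");
      Serial.println(vgs, 3);
      reached = true;
    }
  }
  if (!reached) {
    Serial.println("Gate drive maxed out before reaching 100 mA.");
  }
  analogWrite(GATE_PWM, 0);              // leave the FET off when done
}

void loop() {}
```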

Think it would work? No need for an MCU or signal processing this way, so the science can get done with the tools they have.

Also, I never had free access to strong radiation sources in undergrad, so I'm a little jealous. I barely got to use tritium, and that sparingly.