So I was in the lab for about 10 hours and decided to go home, but there was one run left to do, an experiment. It consists of running one of our substrates through a particular process program on a sputtering machine, specifically to put down a coating of silicon dioxide (basically sand) on the substrate at a certain thickness, about half a micron.
So I gave instructions to the guy we asked to do the job: use program 9, which goes through the steps automatically.
He screwed up and ran program 6, which deposits SiO2 PLUS a silicon carbide layer.
The funny thing is the machine has a vacuum system based on a cryopump, which literally freezes gases out of the chamber, thereby making the vacuum much better.
There is a procedure called 'cryo regen' which heats up the cryopump when it gets too loaded with the gases it has trapped, so they can be pumped away by a mechanical pump.
What happened was some kind of glitch: the cryo went into cryo regen by itself after the SiO2 layer was finished but before the machine could go into the silicon carbide coating step, so it gave an error message and never ran the SiC, saving the experiment!
So in that case, 2 wrongs definitely made a right!
Originally posted by sonhouse:
A friend of mine used to work for HP making silicon wafers. Is that what you do?
The making of silicon wafers starts with a crystal puller: it takes a seed crystal of silicon and, from a vat of molten silicon, pulls and twists upward very slowly to build up a silicon boule. Twenty-five years ago those boules were maybe 1 foot long and 4 inches in diameter, but the technique has advanced so much they can now be 3 feet long and 20 inches in diameter, which means bare wafers 450 mm in diameter, as big as a pizza pan, which in turn means many thousands of computer chips on one wafer. Then they slice and dice 'em with special saws into wafers about one mm thick, polish them and such. The way they are grown, there are 3 crystal face orientations the surface can be in, so once they get the right orientation crystal, cut and polished, they send them on their way to the Texas Instruments and Intels out there.
THOSE guys take those wafers and process them further: etched patterns, ions implanted at specific depths to impart conductivity on the previously non-conducting wafers, and all the patterning and such that ends up as a CPU or memory chip or whatever.
In our business we don't use silicon; we use alumina substrates, with similar processes that end up with gold and aluminum wires and microscopic resistors in an assembly about 1/2 inch wide and 3 to 5 inches long, which is the working piece in a thermal printer. They only last about a year because the paper is abrasive (the kind you see in gas station and grocery receipts).
We are researching a newer technology for those print heads called thin film heads, which are manufactured more like silicon chips, versus the old-style thick film heads, which are made with technology more akin to the screen printing artists use for their work.
Our thin film heads use much much less material and are therefore cheaper to produce, assuming we can get a viable production process going. We are in the end stages of that process now. Close to viability. So that's what we do.
The goof ups I mentioned are part of that process.
Originally posted by sonhouse:
The actual making of the crystal wasn't what I meant. He offered me a wafer to take home. It was defective but it looked fantastic. It was interesting to look at it with a magnifying scope. Hard to believe such tiny ckts can possibly work. Actually, I work with two people who worked there. They would speak of vacuum and getting the outgassing of the equipment removed. Good luck with the new process.
If there were circuits on that wafer (btw, what size was it?), that means it didn't come directly from the wafer manufacturer but from a semiconductor process plant that makes memory chips or CPUs or other computer-related stuff. The same kind of processes can be used to make analog stuff too, like RF amplifiers, RF generators, lasers (which can be analog or digital) and the like. We have progressed from having 4 components on a single chip (the original) to what we have now, with literally billions of transistors per chip, and in a few years, trillions. You can't see anything but diffraction patterns and the roughest, thickest wiring; the actual transistors, FETs and such are way too small to be seen in even the best optical microscope. Well, maybe not the best of the best, but with an ordinary college physics microscope, forget seeing the individual transistors.
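The "too small for an optical microscope" point can be checked with a quick back-of-envelope calculation using the Abbe diffraction limit. The wavelength and numerical aperture below are illustrative assumptions, not measurements:

```python
# Back-of-envelope check: why an optical microscope can't resolve modern
# transistors. Abbe diffraction limit: d = wavelength / (2 * NA).
wavelength_nm = 550          # assumed: green light, middle of the visible band
numerical_aperture = 0.95    # assumed: a very good dry objective

resolution_nm = wavelength_nm / (2 * numerical_aperture)
feature_nm = 40              # rough modern feature size, as in the text above

print(f"Best optical resolution: ~{resolution_nm:.0f} nm")
print(f"Transistor feature size: ~{feature_nm} nm")
print(f"Features are ~{resolution_nm / feature_nm:.0f}x below the resolution limit")
```

Even under generous assumptions the optics bottom out near 300 nm, several times coarser than the transistors themselves, so all you see is the fat interconnect and diffraction color.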
Originally posted by sonhouse:
I think they were about 4 inches in diameter. We looked at it through a magnifying scope and saw the image on a computer screen. With the naked eye it just looked like fuzzy grey squares. I am not sure we were seeing all the mini components, but there was definitely a pattern we could see. I don't know if it was for computer memory or op amps or what, but it was very fascinating. I wish I would have gotten one. He had his mounted on a display board to hang on the wall. I know you are used to this technology, but at times doesn't it amaze you what can be accomplished? Makes one wonder how it can even work.
If there were circuits on that wafer, btw, what size was it?, that meant it didn't come directly from the wafer manufacturer but from the semiconductor process plant that makes memory chips or cpu's or other stuff having to do with computers. The same kind of processes can be used to make analog stuff too, like RF amplifiers, RF generators, lasers (which ca ...[text shortened]... but an ordinary college physics microscope, forget seeing the actual individual transistors.
It still amazes me for sure. I remember when I first started in the semiconductor industry in 1980, at a place called General Instrument in Chandler, Arizona; the feature size back then was creeping up on 1.5 microns and all the buzz was how we were going to get to ONE micron. Man, now we are talking 40 times smaller. That 4 inch wafer you saw, depending on the year it was produced, undoubtedly had features your little microscope could not discern, and the stuff you did see was interconnect lines, some of the bigger ones for power. If each block you looked at was about one inch by one inch or so, it was probably a CPU or memory chip wafer. At one company I worked for, RCA in Mountaintop, PA (bought out at least 3 times by now), they produced power FETs where a hundred of them or more were tied in parallel, so the power was spread out over all those transistors and the final power capability was something over 100 watts, maybe even 200 watts for some.
But each assembly was not much bigger than a pinhead, and all you could see with the naked eye was a bunch of squares less than 1/10th inch on a side, with literally hundreds of them on one wafer. Each of those 1/10th inch squares had hundreds of individual transistors, though.
Now there are literally billions of them on one chip, and they are looking to the day they can squeeze a trillion of them onto one chip a cm or two on a side. That will make machines in what we now consider the realm of supercomputers, hundreds of times more powerful than all the multi-core CPUs we have now.
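The trillion-on-a-chip claim can be sanity-checked with simple area arithmetic. The chip size and device count below are taken from the paragraph above; the square-layout assumption is mine:

```python
# Sanity check: what pitch does "a trillion transistors on a chip a cm or
# two on a side" imply? Assumes a simple square layout, purely illustrative.
transistors = 1e12
side_cm = 2.0
area_nm2 = (side_cm * 1e7) ** 2        # 1 cm = 1e7 nm, so total area in nm^2

area_per_transistor = area_nm2 / transistors   # nm^2 available per device
pitch_nm = area_per_transistor ** 0.5          # center-to-center spacing

print(f"Area per transistor: {area_per_transistor:.0f} nm^2")
print(f"Implied pitch: ~{pitch_nm:.0f} nm")
```

A 20 nm pitch on a 2 cm chip, so the claim is at least geometrically plausible at feature sizes the industry is already talking about.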
One major development they are seriously looking at is word lengths in the thousands of bits rather than the 64 bits in common use today. That would allow dozens of 64-bit instructions to be processed simultaneously, all in one CPU. You can imagine how powerful that would be as a system.
One other big step in speed has already been developed: hard drives with built-in solid-state flash memory, a tiered arrangement that is ten times faster than the hard drive alone. Apple is already using that technology on its latest computers just coming out.
Originally posted by sonhouse:
Very interesting stuff. I bought an Apple computer about a year ago and did not take the flash hard drive option. I wonder if that was what you were speaking of. Most of the computers on our equipment at work use flash memory hard drives, as there is so much vibration. I don't believe they are the ones you speak of here. It is awesome that computer technology has not hit a brick wall yet. In your line of work, do you hear any word on artificial intelligence?
It still amazes me for sure. I remember when I first started in the semiconductor industry in 1980, a place called General Instruments in Chandler Arizona, the feature size back then was creeping up on 1.5 microns and all the buzz was how we were going to get to ONE micron. Man, now we are talking 40 times smaller. That 4 inch wafer you saw, depending on th ...[text shortened]... drive alone. Apple already is using that technology on their latest computers just coming out.
The drive I am talking about melds the old rotating magnetic platter HD technology with solid-state flash, which is much faster at reading and writing but has a more limited total capacity compared to regular hard drives.
What that buys you is intelligent caching: stuff the computer/HD thinks you will need, based on past reads and writes, is staged from the slower hard drive onto the solid-state drive and then out to the system. It's a much faster combo that marries the massive capacity of the newest HDs, into the terabytes now, with the speed of solid-state drives, which are limited to maybe 256 gigs max, at least so far.
They need to improve the rewrite cycles solid-state drives can handle; right now they poop out after about 100,000 rewrites, which for a home computer will still last a long time. And they need to up the density to match the biggest HDs, now around 3 or 4 terabytes, in order to eliminate the physical HD entirely. I suspect this hybrid technology will only have a few years' lifespan until they get solid-state drives up to the cycle life and capacity of rotating HDs. After that, the rotating HD will go the way of the surrey with the fringe on top.
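The caching idea behind these hybrid drives can be sketched as a small fast store in front of a big slow one. This is a toy model of the concept, not any vendor's actual firmware; all names and sizes are made up:

```python
from collections import OrderedDict

class HybridDrive:
    """Toy model of a hybrid drive: a small, fast flash cache in front of a
    large, slow platter store. Blocks read from the platter are promoted into
    flash; the least-recently-used block is evicted when flash fills up."""

    def __init__(self, flash_blocks=4):
        self.flash = OrderedDict()    # block_id -> data, kept in LRU order
        self.platter = {}             # the big, slow store
        self.flash_blocks = flash_blocks
        self.flash_hits = 0
        self.platter_reads = 0

    def write(self, block_id, data):
        self.platter[block_id] = data

    def read(self, block_id):
        if block_id in self.flash:                # fast path: flash hit
            self.flash.move_to_end(block_id)      # mark as recently used
            self.flash_hits += 1
            return self.flash[block_id]
        data = self.platter[block_id]             # slow path: spin the platter
        self.platter_reads += 1
        self.flash[block_id] = data               # promote to flash
        if len(self.flash) > self.flash_blocks:
            self.flash.popitem(last=False)        # evict least-recently-used
        return data

drive = HybridDrive(flash_blocks=2)
for b in ("boot", "apps"):
    drive.write(b, f"<{b} data>")

drive.read("boot")   # platter read, block promoted to flash
drive.read("boot")   # flash hit, no platter access
drive.read("apps")   # platter read, promoted
print(drive.flash_hits, drive.platter_reads)   # prints: 1 2
```

Repeatedly-read blocks end up served from flash, which is the whole speed win; the limited flash capacity and finite rewrite cycles discussed above are exactly why the eviction policy matters.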
Originally posted by sonhouse:
Interesting! The rotating hard drives are not suitable for what we need here. What is the power consumption of a flash drive compared with a rotating drive? I would think the motor and the pickup arm would use much more than a flash drive.
I have thought for the last few years that hard drives are on their way out. Not only is there the issue of power draw, etc., but the wear and tear on hard drives due to mechanical movement is another negative. Also consider the slower access time. I think something akin to the flash drive concept will overtake the hard drive market in very few years. A further take on this idea, something I considered years ago, not even considering the computer use aspects, was something I thought might overtake and replace movies on DVD (or BR, same rotating concept): movies stored on a variation of flash drive (think of it as extended multiple-flash-drive content stored in a cubic form). Getting away from actual mechanical moving parts is where things are headed, IMHO.
Originally posted by sonhouse:
Yeah, 4.7GB on a standard (single layer) DVD, storing 2 hrs of SD movie content. From what I can gather, an HD 2 hr movie requires 25GB (single layer BR).
How many gigs does it take to store a 2 hour movie in HD? Not sure myself. I think you can get a couple hours of regular res on a DVD, but how much more data is there for the same movie in HD? So a regular single-layer DVD stores about 4.7 gigs, I think.
25 gigs per HD movie, 40 movies per terabyte. Wow, not much movie for the buck there. I wonder if that is the best they can get compression-wise? Look at the difference between CD, at 11 megs per minute, and mp3, sounding pretty close but taking up 1/10th the space.
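The disc-size numbers tossed around above translate into average bitrates with a little arithmetic. The sizes and run times come from the thread; the decimal-GB convention is my assumption:

```python
# Rough bitrate arithmetic behind the numbers above (decimal GB assumed:
# 1 GB = 1000 MB = 8000 megabits).
def mbps(gigabytes, hours):
    """Average bitrate in megabits per second for a file of given size/length."""
    return gigabytes * 8000 / (hours * 3600)

dvd = mbps(4.7, 2)   # single-layer DVD, 2-hour SD movie
br  = mbps(25, 2)    # single-layer Blu-ray, 2-hour HD movie

print(f"SD on DVD: ~{dvd:.1f} Mbit/s")        # ~5.2 Mbit/s
print(f"HD on Blu-ray: ~{br:.1f} Mbit/s")     # ~27.8 Mbit/s
print(f"Movies per terabyte at 25 GB each: {1000 // 25}")   # 40
```

So HD at these sizes runs about five times the bitrate of SD, and the "40 movies per terabyte" figure above drops straight out of 1000 / 25.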