Originally posted by @sonhouse
What do they mean by information converted to work anyway?
That is gobbledegook; the only thing that can be converted to work is energy. From what I can tell from the article, they've got a particle in a trap which moves around randomly due to thermal fluctuations, and when it shifts in the direction they want it moved, they move the trap. I assume that moving the trap has no associated energy cost. I'd need to look at the original article to make any further comment, but as far as I can make out this is a variant of Maxwell's demon, and the article's authors are conflating information with entropy.
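For what it's worth, the usual quantitative claim behind these Maxwell's-demon experiments is the Szilard/Landauer bound: at most k_B·T·ln 2 of work per bit of information, which at room temperature is a few zeptojoules. A minimal sketch of that arithmetic (plain Python with the standard Boltzmann constant; the 300 K figure is just illustrative, not from the article):

```python
import math

# Boltzmann constant in J/K (CODATA 2018 exact value)
K_B = 1.380649e-23

def max_work_per_bit(temperature_kelvin: float) -> float:
    """Szilard/Landauer bound: the maximum work extractable per bit
    of information at temperature T, W = k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) this comes to roughly 2.9e-21 J per bit,
# which is why these single-particle experiments are so delicate.
print(max_work_per_bit(300.0))
```

So even a perfect demon converting one bit into work gains almost nothing in macroscopic terms, which is consistent with the effect only being visible for a single trapped particle.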
I've got to say I'm never quite sure what people mean when they talk about information in this kind of context. Shannon introduced his informational entropy in the context of communication theory, and it quantifies the number of bits needed to encode a string of symbols. Our description of physical states involves quantum numbers, and there's clearly a similarity, but I feel that the description is being confounded with the thing itself in these theories. I don't think information is a property of things; we have information about things, but that's just the result of measurements and so forth, not an intrinsic property of the thing. I can't help thinking there's some sort of fallacy in treating information as a property of nature rather than a property of our description of nature.
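To make the Shannon side of this concrete: his entropy is just a function of the relative frequencies of symbols in a string, nothing more. A quick sketch in Python (my own illustration, not anything from the article):

```python
import math
from collections import Counter

def shannon_entropy(symbols: str) -> float:
    """Shannon entropy in bits per symbol: H = sum p_i * log2(1/p_i),
    where p_i is the relative frequency of each distinct symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Two equally likely symbols carry 1 bit per symbol:
print(shannon_entropy("abababab"))  # 1.0
# A single repeated symbol carries no information at all:
print(shannon_entropy("aaaaaaaa"))  # 0.0
```

Note that this says nothing about the physical stuff the symbols are written on; it's a statement about the description, which is rather the point being argued above.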