It has long been an intention of mine to take our techniques for exploring the way the atoms are arranged in complicated materials and apply them to superconductors. The crystal structures of the oxide (high-temperature) superconductors are similar to those found in ferroelectric materials, which we have looked at in some detail. The difference is that in ferroelectrics the positions of the atoms relate directly to the interesting properties, since the ferroelectricity arises from atomic displacements (that is, from atoms moving around), whereas in superconductors the useful property shows up in how the electrons behave, and while this must be enabled by the crystal structure, the link is less direct. Even so, it seems to me that if we want to have a good idea of how the properties arise from the structure, then we need to know what the structure is.
One of the high-temperature superconductors is HgBa2CuO4+δ, a classic ‘copper oxide layer’ superconductor, descended from the original high-TC materials discovered in the late 1980s. We found some data on it in the literature, and decided that while the modelling there was a useful place to start, the model that was developed did not really do a great job of mimicking the observed scattering. Hence, we decided to re-analyse their data.
In summary, we find that when the extra oxygen atoms are added to the structure (that’s the ‘+δ’ in the chemical formula), they go into the structure as long strings of atoms, as correctly identified by the authors of the paper with the original data, which is behind a paywall. What we have done that is new is improve the agreement between model and data by adjusting the positions of the surrounding atoms; it makes sense that when you stuff new atoms into a structure, the ones already there have to adjust to accommodate them. Based on things like bond valence sums, we can get some idea of what these adjustments should be, and then create a model crystal in which the atoms are pushed around in sensible ways in response to the added oxygens. These new atomic positions will then influence the environments of other atoms, and of electrons moving through the structure. Here is an image to break up the text:

Since the paper is open access, I won’t go into massive detail here, but when it comes to modelling the streaks of scattering in the pattern the results are pretty solid. There are some other, subtle details we continue to work on, but so far I think we can conclude that the methods of Monte Carlo analysis of single crystal diffuse scattering promise to deepen our understanding of superconductors and maybe — maybe! — will help us design ones that work at ever-higher temperatures.
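To make the bond valence sum idea a little more concrete, here is a minimal sketch of my own (not code from the paper). The r0 value below is the commonly tabulated Brown–Altermatt parameter for Cu2+–O, but treat all the numbers, including the bond lengths, as purely illustrative:

```python
import math

def bond_valence_sum(bond_lengths, r0, b=0.37):
    """Bond valence sum in the Brown-Altermatt form:
    sum over the bonds to an atom of exp((r0 - r) / b)."""
    return sum(math.exp((r0 - r) / b) for r in bond_lengths)

# Hypothetical Cu-O bond lengths (in angstroms) for a copper site;
# r0 = 1.679 is the commonly quoted Cu(2+)-O parameter.
bvs = bond_valence_sum([1.93, 1.93, 1.95, 1.95], r0=1.679)
print(round(bvs, 2))  # about 1.98, close to the ideal valence of 2
```

If stuffing in an extra oxygen stretches or squeezes these bonds, the sum drifts away from the ideal valence, which is the sort of signal that tells us the neighbouring atoms must relax.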
Methuen, 1946. 126 pages.
Methuen’s Monographs on Physical Subjects was a long-running series of slim volumes dealing with a wide range of subjects, from AC power transmission to cosmology. This particular example is the 1946 revision of Worsnop’s 1930 volume. It covers quite fundamental topics, including the properties and generation of X-rays (pre-synchrotron, of course), scattering (Thomson and Compton), refraction, diffraction, spectroscopy (including Auger) and the importance of X-ray studies in supporting the development of quantum theory.
It may seem on the surface that a book from seventy years ago would be of nothing but historical interest. This is in fact not true. The volume gives a very clear account of how an X-ray tube works — and these are still the most common sources of X-rays — and explains how the X-ray spectrum is obtained, with its continuous background and characteristic radiation. It also traces out how X-rays were first characterised, their wavelengths determined, and their properties explored in early important experiments. Both of these give a sense of the history of the field and also present some important physics in a very accessible way. Yes, it does in places use the ‘X-unit’, which was not destined to remain part of the field, and refers to ‘centrifugal force’ in a way which I think suggests that the author has not thought clearly about some fundamental aspects of mechanics (or that word usages have changed a little).
These little books show up here and there in jumble sales and book shops, and I’ve accumulated a small subset of them. They are very readable, though pitched at a fairly high level — this is not popular science! — and I continue to pick them up when I see them.
For workers in the field.
Pan 2003, 497 pages.
This is a fascinating book. Sheer detail brings Hooke’s remarkable career into sharp focus.
Inwood is not a prose stylist, I would venture to say. Perhaps it is due to the nature of Hooke’s career — he pursued many themes for a long time — but the text comes to be rather repetitive. List-like. But my interest never flagged because of the subject, because of the pains taken over the research, and because of the enormous significance of Hooke’s work.
Hooke was one of the key figures of the 17th century, at least in England. He left no field of natural philosophy untouched, yes — but was also second only to Wren in shaping the rebuilt London that rose after the great fire. His contributions were perhaps rarely fundamental. He was part of the debate that laid the groundwork for Newton’s Laws, and stated some of Newton’s results before Newton, but from intuition; and without Newton’s impeccable mathematical foundations, his comments were more in the form of opinions in a debate, rather than laws carved in stone.
Why is he so often merely a footnote to the Newton story?
There are several reasons.
One is that Hooke was a professional research scientist — possibly the first in the land. Newton inherited and was gifted enough money to allow him to develop his ideas in a lofty isolation, giving his perfunctory lectures at Cambridge but essentially able to think and dig deep. Hooke was employed by The Royal Society to provide them with demonstrations every week, some titbit to fascinate the dilettantes. One week he was inflating an animal’s lungs or evacuating vessels, the next demonstrating a new pendulum or sextant. He did not have the luxury of time and resources for deep, fundamental study. But I suspect Hooke would have thrived in today’s scientific environment, where entrepreneurship is all the fashion, though would have found many of us far too narrow for his liking.
Related to that was his need to maintain reputation. Hooke was not poor — but he relied on his own efforts for his money. Forty pounds a year for this, fifty for that, a fee for designing a mansion, and so on. This meant that again the need to live got in the way of really grappling with the essence of a field. Further, it explains his irritating and ultimately counter-productive mania about priority of various discoveries. Only by ensuring that everybody knew that he was the mind behind various ideas could he be sure that the employment would continue. This led him to claim he had achieved things he had not — or to prematurely claim achievements that never came to fruition, or to play odd games like using a code to present results he wanted to claim as his own but was not yet ready to reveal. The end result was a great deal of scepticism toward his every word from certain figures, in particular partisans of other great figures of the time like Newton and Huygens.
But I suspect it was in his nature to flit from topic to topic. His was a restless energy. He did fundamental work in chemistry — where he was Boyle’s right-hand man — and made some statements that presage the ideal gas law; and in physics, where he invented early vacuum pumps, made important strides in time-keeping (work which led to his most persistent memorial — Hooke’s Law of the force due to the extension of a spring), in astronomy and in optics. In biology he did early work on the nature of respiration and published Micrographia, one of the most important texts of its time and a key work in the history of microscopy and biology. He coined the term ‘cell’ in biology, by analogy with a monk’s cell, when he was looking at the structures of cork under one of his own microscopes. In my own field of crystallography he proposed the idea that crystals were made of stacked identical building blocks, and that this explained the regular facets. Typically, this is rarely mentioned in crystallography texts.
Another reason for Hooke’s lower fame is, I suspect, that no portraits of him remain. No little marginal bio with a photo appears in a history or text book. It adds up.
Yet he was in some ways the most modern of all the figures of his time; he was a scientist by career rather than as a gentlemanly pursuit, and a firm believer in the primacy of reason and evidence. Newton explored alchemy and magic, and has aptly been described as an early scientist and a late sorcerer. Hooke saw petrified shells high up in the mountains and, rather than convince himself they were ‘figured stones’ (what? decoys buried by God?), insisted that they had once been in the sea and the sea bed must have risen, and if that meant that the world was older than the bible indicated then… so be it. He found the conclusions difficult to stomach, but he did not bury his head in the sand, unlike so many around him. And he came to these ideas a century before Hutton came on the scene and two before Lyell. But, typically, he did not bury himself in the work, but threw off ideas, argued in their favour, and moved on. Part of the greatness of Darwin is that he buttressed his theory and made it impossible to ignore. Similarly, Newton underpinned his ideas about gravitation — most of which had been quoted previously by someone else, Hooke included — by a unifying mathematical treatment that made them more than a matter for debate. It is remarkable how often figures we venerate for their originality in fact were not as original as we think, but more rigorous. We should not underestimate the importance of this! We all tend to cling onto old ideas as long as we can. They are comfortable, familiar, accepted. To displace them takes fortitude and thoroughness. Especially in earlier times, when religion retained its grip.
He also invented the universal joint.
This book is essential reading for anyone interested in the history of science, or in Newton or the 17th century. It offers lessons on the parlousness of reputation and legacy, and is testament to Inwood’s inkling that there was a story here to be told. Even the workmanlike nature of the prose, which I began by criticising, seems like the only language suitable for the topic; forthright, truthful and putting content above form.
Intel 6th-Gen i7 6700K SSD DDR4 4.0GHz CPU, 16GB DDR4 RAM, 2TB SATA III 6GB/s HDD,N600 Wireless Dual Band PCI-Express Network Adapter with 2 Antennae. (Just a cut and paste from the specs.)
Ordered it from D&D Computer Technology Pty Ltd, and delivery was pretty quick. At my work the standard Linux ‘solution’ is RHEL, so it is running RHEL 6.7 (the IT guys here don’t like 7 — it uses the controversial systemd, for one thing…)
Wireless internet so I can put it wherever I want to.
Compared to our previous generation of boxen (4+ years old), it runs a fairly typical Monte Carlo simulation in 20m55s instead of 27m21s, a useful but not massive improvement. This is because the code is a single, single-threaded process, so its speed scales with clock frequency more than anything else.
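For the record, the arithmetic behind that comparison:

```python
# The quoted run times: 27m21s on the old box, 20m55s on the new one.
old = 27 * 60 + 21   # 1641 seconds
new = 20 * 60 + 55   # 1255 seconds
print(f"{old / new:.2f}x faster")                   # 1.31x
print(f"{100 * (old - new) / old:.0f}% less time")  # 24%
```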
I’ve put LaTeX on the box, but I am going to manage it via TeXLive’s tlmgr rather than RHEL’s package management, so we’ll see how that works out…
Here begins a technicalish, science-y post.
This post is all about a paper we recently published in IUCrJ; here is the link: http://dx.doi.org/10.1107/S2052252515018722.
When X-rays or neutrons scatter off a sample of crystalline powder, the result is a powder diffraction pattern. Usually the intensity of the scattering is measured as a function of the angle of scattering for radiation of a fixed wavelength. The angle can be converted to the more universal ‘scattering vector’ magnitude, Q = 4π sin θ / λ, where 2θ is the scattering angle and λ is the wavelength.
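As a quick sketch, the conversion from scattering angle to Q looks like this (assuming 2θ is given in degrees; Q comes out in inverse units of whatever the wavelength is in):

```python
import math

def two_theta_to_q(two_theta_deg, wavelength):
    """Q = 4*pi*sin(theta)/lambda, with the scattering angle 2-theta
    given in degrees and the wavelength in, say, angstroms (giving Q
    in inverse angstroms)."""
    theta = math.radians(two_theta_deg / 2.0)
    return 4.0 * math.pi * math.sin(theta) / wavelength

# e.g. a peak at 2-theta = 30 degrees with Cu K-alpha (about 1.5406 A)
print(round(two_theta_to_q(30.0, 1.5406), 3))  # 2.111 inverse angstroms
```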
Now, when analysing a pattern like this, the most common method is Rietveld refinement, in which a possible unit cell is posited, and its diffraction pattern calculated and compared to the observed.
Now, this is very useful indeed, but there are a couple of issues. The first is that this sort of analysis only uses the strong Bragg reflections in the pattern — the big sharp peaks. Mathematically, this means it finds the single-body average, which is to say that it can show what is going on on each atomic site but not how one site relates to another. For example, it might say that a site has a 50% chance of having an atom of type A on it and 50% of type B, but it can’t say how this influences a neighbouring site. Do A atoms cluster? Do they like to stay apart? This information, if we can get it, tells of the short-range order (SRO) in a crystalline material, where the Bragg peaks tell of the long-range order. SRO is important, interesting, and rather difficult to get a handle on.
Now, the flat, broad (‘diffuse’) scattering between the Bragg peaks — stuff that looks rather like background, and is often mixed up with background — contains two-body information. If the non-sample scattering is carefully removed, then what is left is all the scattering from the sample, and only scattering from the sample. This is called the Total Scattering (TS). This can then be analysed to try to understand what is going on. The most common way of doing that is to calculate the pair distribution function (PDF) from the TS. This essentially shows the probabilities of finding scatterers at different separations — a two-body probability, which helps us ‘get inside’ the average structure that we get from Bragg peak (Rietveld) analysis.
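The core of the PDF idea can be sketched in a few lines: take a set of atomic positions and histogram every pairwise separation. This toy version (my own, for illustration) skips the scattering-power weighting and normalisation that a real PDF calculation has:

```python
from itertools import combinations

def pair_histogram(positions, bin_width=0.1):
    """Toy one-dimensional PDF: count pairwise separations into bins
    of width bin_width. A real PDF also weights each pair by
    scattering power and normalises against the average density."""
    counts = {}
    for a, b in combinations(positions, 2):
        bin_index = int(abs(a - b) / bin_width)
        counts[bin_index] = counts.get(bin_index, 0) + 1
    return counts

# Five atoms in a chain with unit spacing: separations 1, 2, 3 and 4
# occur 4, 3, 2 and 1 times respectively.
print(pair_histogram([0.0, 1.0, 2.0, 3.0, 4.0], bin_width=0.5))
```

The result is a two-body quantity: it counts pairs, not sites, which is exactly what the Bragg-only analysis cannot give.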
Now, this is all talking about powders. The main issue is that a powder is a collection of randomly oriented crystallites/grains which means the pattern is averaged. Ideally, it would be nice to have a single crystal, to measure the total scattering in a way that is not averaged by random orientation. This is Single Crystal Diffuse Scattering, SCDS. It is (in my opinion) rather a gold standard in structural studies, but is pretty tricky to do…
What the paper we have just published in IUCrJ does is to take a system we have studied using SCDS, and then study it using PDF to show what things the PDF can reasonably be expected to reveal and what features are hidden from it (but apparent in the SCDS). We did this because we felt that PDF, powerful as it is, was perhaps being over-interpreted and treated as more definitive than it is; in many cases it is the only viable technique, so it is hard to gauge when it is being over-interpreted. Hence we look at it in a case where it is not the only available method.
What we found was that PDF is very good for showing the magnitudes of the spacings between atoms, and for showing the population of the spacings between atoms, but is not good for showing how these spacings might be correlated (ie, are the closely spaced atoms clustering together?). Similarly, it was not good at showing up the ordering of atoms (…ABABA… vs …AAABBBB… for example).
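That ordering point is easy to make concrete. The two toy chains below (my own example) have identical composition — so identical one-body averages — but opposite neighbour statistics, which is precisely the two-body information at stake:

```python
def unlike_neighbour_fraction(chain):
    """Fraction of nearest-neighbour pairs that are unlike (A beside B)."""
    pairs = list(zip(chain, chain[1:]))
    return sum(1 for a, b in pairs if a != b) / len(pairs)

# Same composition (four A, four B), so identical single-site averages...
alternating = "ABABABAB"
clustered = "AAAABBBB"
# ...but completely different short-range order:
print(unlike_neighbour_fraction(alternating))  # 1.0: every neighbour unlike
print(unlike_neighbour_fraction(clustered))    # one unlike pair out of seven
```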
The PDF is in real space — it is a plot of probability against separation, where separation is measured in metres, like distances in the world we experience. The SCDS and the TS exist in reciprocal space, where distances are measured in inverse metres (m⁻¹). Some atomic orderings give rise to features that are highly localised in reciprocal space, so they are best explored in that space. Also, if the ordering in question only affects a small section of reciprocal space, and that is getting smeared out by the powder averaging, then it won’t show up very well in the TS, and hence not in the PDF either.
For example, above is a cut of SCDS calculated from an analysis of the PDF, whereas below is our model for the SCDS. Clearly the latter should be a lot better — and it is. No surprise. This is not making the PDF fight with one hand tied behind its back, nor setting up a straw man. The point is not to show that SCDS is a more definitive measurement; the point is to show what PDF can be expected to tell us, so that when we are studying the many systems that we cannot study with SCDS because we cannot get a single crystal, we know when we are stretching the data too far.
Advances in Condensed Matter Physics is an open access journal. I have published in OA journals before, but I am pretty selective about them. For example, IUCrJ is published by the IUCr, and I strongly believe that their name is a ‘guarantee of quality’ as good as any. The other open access paper was in ISRN Materials Science, which was more of an unknown quantity for me; its publisher seems to have expanded via the OA model, so the idea of sending work there initially filled me with skepticism. This rather abated when it turned out they wanted a commissioned article and they were even prepared to pay me for it. I am quite happy with how it turned out. The picture below is not that paper but the new one.
Recently, I got an email from Advances in Condensed Matter Physics asking me for an article. Now, I delete several of these sorts of emails a day, and was about to delete this one when I thought, well, perhaps I should give the publisher — Hindawi, publishers of ISRN Materials Science — a second look. The fact that they were prepared to commission something from me showed (of course) remarkable good taste, and the fact that they were prepared to pay for it (and did) suggested that they have an intention of climbing out of the ruck and maul of crappy OA publishers.
So I reread the email. It appeared that they were prepared to waive the fee. Good start. I then went to the ACMP website and searched for the names and work of some scientists that I respect. And they had indeed published there. The journal has a proper impact factor, even if not especially high, and is not just indexed on google scholar (ie, not really indexed at all) but in proper databases like Web of Science. Okay, so it seems like a real journal.
There was an article I wanted to write, one of a very specific kind. I have been developing and using the ZMC software for modelling diffuse scattering from molecular crystals — looking at subtle orderings in materials to improve understanding of structure and function. The process is non-trivial, and not easy for the novice to get a grip on. So what I wanted to do was create an example of a simulation of a crystal, write a paper about it, then upload a bundle of files (as ‘additional material’) that would let a user recreate the simulation.
This seemed like an ideal opportunity, because the software would then be available and not hidden behind a paywall. Indeed, that is how it has worked out — the simulation (‘supplementary material’) can be downloaded by anyone by going to this page.
So not all OA publishers are alike, but it is important that prospective authors do their research into a journal and make sure it is credible.
Here is my recipe for making little videos with a document camera. Please note that the recipe is specific to the make and model discussed (Lumens DC192) and set up particularly for videoing a human hand scribbling down a solution while speaking about it at the time. This is not flashy stuff, just material for a moodle page for a Physics course.
So the image on the video looks like this:
Well, it should be, but there can be a lot of trickinesses. Anyway, here is my recipe. I do not expect anybody to follow it closely (or at all), but the steps and the things I have to think about may be useful for some if they need to make little videos, often in relation to ‘flipping’ a Physics course, which is very much the fashion these days.
How I made a very simple video using the Lumens DC192 Document camera
- Lumens DC192 document camera.
- USB memory stick.
- VGA cable.
- Pens, paper, your brain, your script, etc.
- A computer and software for editing.
If using the camera on a desk rather than in a lecture theatre, the simplest approach is to run the VGA cable from the camera to a monitor; that is all you need.
(1) Got a 4GB+ USB stick and deleted all files. Not necessary to empty the stick but this minimises chance of running out of space during a video capture.
(2) Picked a problem or two. Made sure I had good worked out solutions, and printed them out/wrote them up on paper – see image below (figure 2).
(3) Got some overhead markers. I find they give a dark line but are not too coarse. Staedtler ‘Lumocolor’ type pens with a fine tip were good.
(4) Printed out an A4 sheet with the problem occupying the top left of the page, landscape. Thought about whether more than the remaining white space was necessary, and had extra sheets ready. See figure 3. A few extra copies were useful too… made sure it was printed pretty dark and pretty big. (For scanned images I used ImageJ to Gaussian blur, then Brightness/Contrast to darken and turn the blur into thicker lines.)
(5) Got my sheet with the already-worked-out solution (figure 2) and marked a few key words on it for concepts I want to make sure I covered, things like ‘conservation of energy’ or ‘Newton’s laws’ or whatever.
(6) Also noted on it what colours I want to use where. I have a few pens — black, red, blue green and brown. No yellow (not dark enough on the white paper).
(7) Got remote control for DC192.
(8) Killed any sources of noise I could – climate control, for example. Put ‘Do Not Disturb’ on my door. Shut all doors, turned on all lights. Used an extra lamp to avoid shadows on the page.
(9) Taped down an A3 sheet such that it filled the screen. Then put my A4 with printed problem on top of that, taped down with magic tape.
(10) Removed jangly things from my pockets — keys, for example. And coins. Do you write notes on the backs of your hands? Best wash them off. I didn’t…
(11) Inserted a USB stick into the DC. Screen said ‘copy to USB stick’ (it may not actually ask you this, depending on camera settings). Selected ‘no’ by pressing ‘Enter’ button on the DC. (Answer ‘yes’ if you want to copy the contents of the DC’s internal memory.)
(12) Adjusted the camera neck and zoom such that only my A4 page was visible. In fact, if the camera was positioned right it could only see the white A4 page. Made some small marks on the paper to indicate where the edges of the camera window were, so I would not write stuff off the edge.
It may be good to use the remote to adjust the microphone level to about a quarter, to reduce saturation and the resulting lousy sound. This setting is remembered when the unit is on stand-by, but should be checked if it has been turned off at the wall.
(13) Arranged my notes and pens ready for use while talking. Tested all pens, and placed so that they would not roll into camera shot.
(14) Reviewed my script. Thought about: How much intro? Dive straight in? Talk about problem solving skills in general or just do this particular problem? Talk about units, orders of magnitude, sanity checks?
(15) Pressed ‘record’ on the remote and got started. A picture of a camera appeared on the monitor to indicate recording was going on. (Note that the files will just have names like LUMN0001.AVI, so you either want to have a visual cue early in the video to identify it or make a note as you go on a notepad or something — if you are making a bunch of them at a time, anyway.) I made sure I renamed the files to something meaningful when copying them across to my desktop machine. Also, note that the files are AVI files holding uncompressed video, so they’ll be much bigger than a final, produced mpg/mp4 type file — and may fill up the USB faster than you expect.
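Since the LUMNnnnn names are easy to mix up, the renaming can be done in bulk with a little helper like this (a sketch of my own, nothing to do with the camera’s software; the topic names are placeholders):

```python
import glob
import os

def rename_recordings(topics, pattern="LUMN*.AVI", dry_run=True):
    """Map the camera's generic file names onto topic names, in
    recording order (the LUMNnnnn numbering is sequential). Returns
    the (old, new) pairs; only renames when dry_run is False."""
    mapping = []
    for avi, topic in zip(sorted(glob.glob(pattern)), topics):
        new_name = topic + ".avi"
        mapping.append((avi, new_name))
        if not dry_run:
            os.rename(avi, new_name)
    return mapping

# e.g. rename_recordings(["projectile_motion", "energy_conservation"],
#                        dry_run=False)
```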
Some comments on recording:
- Don’t be afraid to pause while filming; pauses can be cut out in Camtasia. I made sure I just paused then continued, if necessary repeating myself a little, rather than continually stopping and retaking the video.
- I made some minor crossings out, and think that’s OK. Students don’t mind ‘warts and all’, but it should not interfere with clarity.
- I found that the microphone is pretty sensitive, so I can afford to talk normally or even more quietly. It is easy to cause distortion in the sound. As noted above, low microphone levels can be set using the menu accessed by the remote control or by the menu key on the unit, and you may well find it useful to set this low – about 1/4 seems good. Doing this before starting to record is a good idea.
(16) Hit the record button to turn it off. Done! I kept the A4 sheet with the working out so I could scan and upload it along with the video.
(17) Turned off and cleaned up after myself, not forgetting the USB stick.
(18) Fired up Camtasia (or similar).
(19) This is NOT a Camtasia tutorial. I did this, though: Opened a new project, imported the video, normalised the audio and did noise reduction (I use default values), then edited out my pauses and idiotic remarks and produced a 480p video without SmartPlayer (‘MP4 only (up to 480p)’). Other useful things include speeding up sections (places with lots of algebra I just shut up and wrote, then later sped up by a factor of four) and using callouts to highlight things. I tried to keep that to a minimum.
In more detail:
(a) Opened Camtasia
(b) File -> ‘Save project as’ … gave it a name.
(c) File -> Import Media -> selected the AVI file. I created a ‘Videos’ folder somewhere, created the camproj file in there and put the AVI files in some subfolder.
(d) Right clicked on the video thumbnail and ‘Add to timeline at playhead’
(e) Audio tab, then checked ‘Enable volume levelling’ and ‘Enable noise removal’
(f) Started cutting bits off the video, leaving the good/least worst bits.
(g) Produce it: File -> Produce and Share -> Whatever format (MP4 only (up to 480p)) -> and done! Files are not that big (15MB) in this format.
That’s the procedure, such as it is.
Well, I get several emails a week from dodgy journals asking me to pay for them to publish my science. While there are good places to go and check whether they are predatory (here, most obviously), I thought I would accumulate my own little list. It is here: https://darrengoossens.wordpress.com/science-spammers/ and is just a list, sometimes with links, of the journals/publishers who have asked me for material and who are either clearly rubbish or who may or may not be rubbish but are still spammers and therefore should (perhaps) not receive support from authors. So they may not all be rubbish, but they are spammers at the least.
I am particularly keen for younger, less experienced authors to avoid these traps. A great many open access journals are very good — I have published in them myself. A great many more, sadly, are predatory, running shonky websites and asking authors for dollars in return. Often the members of their supposed editorial boards either do not exist, do not know they are on the board, or joined without knowing what they were getting into and have since asked to be removed, without success.
It is a bit of a mess, and I would hate to see a young researcher ‘burn’ their good work by putting it in one of these journals.
My little list is just a data point, a personal experience of the flood of solicitations by dodgy publishers.
After getting it to work on Windows, I wanted to install G95 as well as GFortran to test some code. I’m using a Linux 32-bit Virtual Machine as detailed here. I don’t know why, but following the instructions did not work for me. This is what I did, using the 32-bit x86 binary on a VirtualBox VM:
Opened a terminal then typed all this at the command line (errors edited out):
mkdir installs
cd installs
wget http://ftp.g95.org/g95-x86-linux.tgz
tar -xvzf g95-x86-linux.tgz
cd g95-install/
ln -s bin/i686-pc-linux-gnu-g95 ~/bin/g95      # DID NOT WORK, so:
rm ~/bin/g95
ln /home/username/installs/g95-install/bin/i686-pc-linux-gnu-g95 /home/username/bin/g95
PATH=/home/username/installs/g95-install/lib/gcc-lib/i686-pc-linux-gnu/4.0.3:$PATH
export PATH
LIBRARY_PATH=/home/username/installs/g95-install/lib/gcc-lib/i686-pc-linux-gnu/4.0.3
export LIBRARY_PATH
Then I could compile. I had to add the big ‘/home/username/installs/g95-install/lib/gcc-lib/i686-pc-linux-gnu/4.0.3‘ entry to both PATH and LIBRARY_PATH for it to work. I don’t think I should have to, but I did have to.
Note that /home/username/bin is also in the PATH.
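To avoid re-typing those exports in every new terminal, they could be appended to a shell startup file — a sketch only, using the same placeholder username and paths as above (adjust to your own setup):

```shell
# Append the PATH and LIBRARY_PATH settings to ~/.bashrc so they
# survive new shells. G95LIB mirrors the long path used above.
G95LIB=/home/username/installs/g95-install/lib/gcc-lib/i686-pc-linux-gnu/4.0.3
RCFILE=~/.bashrc
cat >> "$RCFILE" <<EOF
export PATH=$G95LIB:\$PATH
export LIBRARY_PATH=$G95LIB
EOF
```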