Predatory publishers or legit? It’s not always easy to spot. What about this one?
Note how the list of institutions is empty!
Dear Dr. Goossens,
Due to your involvement in the field, and the research you published in your paper, “Synchrotron X-ray diffuse scattering from a stable polymorphic material: Terephthalic acid, C8H6O4,” IntechOpen invites you to extend your work and offer a more comprehensive overview of your studies. Contribute a chapter to “Synchrotron Radiation,” an upcoming Open Access book edited by Dr. Name Name.
Work with an internationally recognized peer group and gain increased visibility for your published work. Please visit the book project page to register your interest.
We look forward to hearing from you.
Marija Gojevic-Zrnic, from IntechOpen
Author Service Manager
It’s correctly written, their website looks more or less professional, their books do certainly get printed and distributed — I’ve seen them in proper libraries. They’re open about their (high in my opinion) fees — £1500 for a 16 to 20 page chapter. So are they predatory or just expensive? I guess they’re just expensive.
I still would not publish with them.
Something like IUCrJ (fee of USD1300, or about £1000, so quite a bit cheaper) covers similar topics, is also open access, is part of a society publisher of high standing, has a reliable archiving policy and is embedded in all the major search engines and databases. I have no experience with the editorial process at InTech, and it may be very good, I can’t say, but I do know that the IUCr process is superb, with their editors doing more than just farming out the material to review — they genuinely interrogate it themselves.
The issue? IUCrJ might bounce the work! Also, book chapters are often more review-y and may not be publishable as papers.
But … if you are career-conscious, ask: will the book chapter collect citations anyway? Will it fall through the cracks of whatever research metric engine your bosses want to see quoted? Sadly, this is the reality. Monographs and book chapters might be excellent and important, but will they be noticed? I suggest finding a good-looking chapter or two from an InTech or similar volume and then checking its stats in whatever database you use for citation and impact metrics — Web of Science or whatever. Is it there? Are the numbers reasonable? And so on.
In the end, if you want to go open access, there are reputable journals that will take your money, and £1500 is enough to get into some pretty reputable ones. And conventional publication still exists. InTech might be OK, but check it out first and be aware of the options!
Wrote a crude program to calculate X-ray and neutron (nuclear and magnetic) diffraction structure factors.
StrucFact or StrucFact.exe is a crude program to calculate expected intensities for neutron diffraction (magnetic and nuclear) and X-ray diffraction (in the version from 2013). It is distributed as source code because any serious user will likely need to modify the source to make sensible use of it. Its mathematics is essentially taken from Neutron Diffraction by George Bacon and from H. M. Rietveld, J. Appl. Cryst. (1969) 2, 65–71 (see also here), and its mandate is limited; its job is to calculate F² for neutron scattering for arbitrary cells, magnetic or nuclear, and more recently for X-ray diffraction.
It is ‘developed’ solely on an ‘as needed’ basis, which means I add ‘features’ when I need them to solve some problem I am working on. The inverted commas may seem gratuitous, but they are not!
I am sure there are better tools out there for everything that this program does, and I advise against using it. A README.TXT and the code itself should be distributed along with this file.
1. It does not work for incommensurates (unless you want to define an enormous cell).
2. It treats every cell as P1 (i.e. you have to give it all atoms explicitly).
3. The nuclear and magnetic cells must be the same size, which means that if one is bigger than the other (usually magnetic bigger than nuclear) you have to put the atoms from the smaller cell into the bigger, including inserting all redundant copies of atoms.
4. No corrections for intensities (e.g. not even Lorentz), no B-factors beyond the isotropic.
In other words, it is remarkably limited. Why anyone would want to use it when FullProf and GSAS and the like are around, I do not know. Having said that, it is quite simple compared to such programs (in the sense that everything has to be done explicitly, so it is laborious but not as conceptually demanding), and the (minimally tested) code is available here.
This is Fortran90 code that does not need any extra libraries. gfortran is to be preferred.
gfortran -o StrucFact.exe strucfact.f90
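For orientation, the core of what the program computes is just a phase sum over the atoms of the (P1) cell. Here is a minimal Python sketch of the same mathematics — my own illustration, not code from StrucFact, and the two-atom cell and scattering lengths are made up:

```python
import cmath

def F_hkl(atoms, h, k, l):
    """Structure factor F(hkl) for a P1 cell.

    atoms: list of (b, x, y, z) with scattering length b and fractional
    coordinates x, y, z.  Returns complex F; the observable intensity
    is proportional to |F|^2.
    """
    return sum(b * cmath.exp(2j * cmath.pi * (h * x + k * y + l * z))
               for b, x, y, z in atoms)

# A toy CsCl-like cell: one atom at the origin, one at the body centre,
# equal scattering lengths.
cell = [(1.0, 0.0, 0.0, 0.0), (1.0, 0.5, 0.5, 0.5)]

F100 = F_hkl(cell, 1, 0, 0)  # h+k+l odd: the two atoms cancel
F110 = F_hkl(cell, 1, 1, 0)  # h+k+l even: the two atoms add
```

With equal scattering lengths the body-centring extinguishes the reflections with h+k+l odd, so |F(100)|² is zero while |F(110)|² is 4, which is the sort of F² table the program spits out.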
Please report errors in the code to /dev/null, although if desperate you can email me.
Download from http://djg.altervista.org/downloads/StrucFact_Files.tar.gz or http://djg.altervista.org/downloads/StrucFact_Files.zip. There’s a PDF inside the archive that gives more info.
A new scientific paper. A tribute to the residual momentum of my scientific career, and in particular to Eric Chan, who has built on my work to come up with a way of exploring modulated molecular crystals. https://doi.org/10.1107/S1600576717015023
It’s pretty subtle stuff, but basically a crystal structure can show a variation from cell to cell — for example, a displacement or substitution of an atom or molecule. If this variation is periodic, then it can be described using a periodic function. Such a function will have a Fourier transform that requires (relatively) few Fourier terms, and each strong term will give rise to a bright spot in the diffraction pattern. These spots will occur in a motif centred on (some) Bragg peaks, potentially adding many new spots to the diffraction pattern.
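The satellite-spot idea can be seen in a toy one-dimensional chain. This is my own sketch; the chain length, modulation wavevector and amplitude are arbitrary choices for illustration, not values from any real material:

```python
import cmath, math

N, q_mod, A = 200, 0.15, 0.1  # atoms in chain, modulation wavevector, amplitude

def position(n):
    # Average position n plus a sinusoidal displacement modulation.
    return n + A * math.sin(2 * math.pi * q_mod * n)

def intensity(k):
    # Scattered intensity |sum_n exp(2 pi i k x_n)|^2 at wavevector k.
    amp = sum(cmath.exp(2j * math.pi * k * position(n)) for n in range(N))
    return abs(amp) ** 2

bragg      = intensity(1.0)          # a main Bragg peak
satellite  = intensity(1.0 + q_mod)  # first-order satellite beside it
background = intensity(1.0 + 0.052)  # a point away from peak and satellite
```

Scanning k shows strong peaks at the integers (the Bragg positions) with weaker satellites displaced by the modulation wavevector on either side — exactly the extra motif of spots described above.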
So where these bright spots occur tells you about the modulation. However, in something like a molecular crystal, the molecular structure factor may be relatively complicated, and so may the nature of the modulation. This may mean that it is not easy to predict where the modulation spots are likely to be intense.
Eric figured out a way to use my program ZMC to generate modulated molecular structures and then calculate their diffraction patterns.
It is pretty heavy and specific stuff, but it also is a capability that I’ve not seen elsewhere. Eric’s webpage is at https://sites.google.com/site/echanj/, and that is the best place to go to have a look for the code.
One of the most-viewed pages here (it’s all relative; still not very many hits really) is the list of dodgy publishers. It’s a drop in the ocean, but it gets a few hits. Something else I’d like to bring together in one place, even though it’s been done elsewhere, is a summary of self-archiving policies. I’ll focus on the journals I’ve published in, preparatory to putting together a web archive of all my papers that I am allowed to self-archive (on my website and maybe on ResearchGate). It’s really just a resource for me, but I might as well make it public.
Some journals are open access; you don’t need to self-archive those, but usually you can.
Some allow you to archive a proper reprint, processed and edited by the journal — like the IUCr, as shown below.
Some suggest you archive the ‘author’s final version’ but don’t want you to put up anything with the journal’s imprimatur.
Some say ‘mine mine mine’ and don’t let you host it at all. I hope to make this clear.
The page lives at https://darrengoossens.wordpress.com/journal-self-archiving-policies, and so far has exactly one (1) entry, the good old IUCr, which has a very enlightened policy. They allow self-archiving as long as you use the official e-reprint, rather than just the downloaded PDF, and they request that you provide a link to the journal. Seems very reasonable. The official e-reprint is easy to recognise; it has a distinctive front page with some official words on it, something like this (colours may vary):
I think their policy is very reasonable because the IUCr has a very professional publication and editorial team who need to be paid and ought to be paid. Subscriptions are part of the mix, yet they allow authors to house their own work and to distribute copies to colleagues freely. It seems a very sensible mix.
This paper and its deposited material explore clustering of 2 × 1 dimers (dominoes) subject to simple interactions and temperature. Much of the work in domino tilings has been statistical, combinatorial and thermodynamic in nature. Instead, here, the domino is used as a simple model of a non-spherical molecule to explore aggregation, rather as if the molecules were interacting in solution. As a result, the work does not look at how many ways there are to tile a plane, but at how the cluster evolves with different parameters in the potential that governs the clustering. These parameters include the rules used to select which of the many possible dominoes will be added to the cluster, and temperature. It is shown that qualitative changes in clustering behaviour occur with temperature, including effects on the shape of the cluster, vacancies and the domain structure.
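For anyone who wants the flavour without downloading the bundle, here is a stripped-down sketch of temperature-controlled domino cluster growth. It is my own toy, not the code distributed with the paper; the energy rule (count the occupied cells a placement touches) and the parameters are invented for illustration:

```python
import math, random

random.seed(1)  # reproducible toy run

def neighbours(cell):
    x, y = cell
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def grow_cluster(n_dominoes, T):
    """Grow a cluster of 2x1 dominoes on a square grid.

    A candidate placement is any pair of empty adjacent cells touching
    the existing cluster.  Its 'energy' is minus the number of occupied
    cells it touches, so candidates are picked with Boltzmann weights
    exp(contacts / T); low T favours compact growth.
    """
    occupied = {(0, 0), (1, 0)}  # seed domino
    for _ in range(n_dominoes - 1):
        seen, candidates, weights = set(), [], []
        for cell in occupied:
            for a in neighbours(cell):
                if a in occupied:
                    continue
                for b in neighbours(a):
                    if b in occupied:
                        continue
                    placement = frozenset((a, b))
                    if placement in seen:
                        continue
                    seen.add(placement)
                    contacts = sum(nb in occupied
                                   for c in (a, b) for nb in neighbours(c))
                    candidates.append(placement)
                    weights.append(math.exp(contacts / T))
        occupied |= random.choices(candidates, weights)[0]
    return occupied

cluster = grow_cluster(50, T=0.5)  # 50 dominoes = 100 occupied cells
```

Raising T flattens the weights so ragged, vacancy-riddled clusters become more likely, which is the qualitative temperature effect the paper explores in earnest.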
The paper is on the web, open access, at http://dx.doi.org/10.3390/condmat2020015 and http://www.mdpi.com/2410-3896/2/2/15. It comes with a bundle of software anyone can use to play with the model, modify it, whatever. Please do!
It’s basically a toy model, but it shows some nice behaviour. Apologies to the red/green colour-blind.
In a very recent post, I mentioned an appendix to an article I wrote. I rather like it. The appendix grew out of a little document I put together. That document is longer, vaguer and a little different from the published appendix, and so I am putting it here. Now, the article was written in LaTeX, and this is a website, so I tried running htlatex on the file. It was very complicated:
$ htlatex planes
$ firefox planes.html
And it worked. Next thing is to get it into WordPress… Easy enough to cut and paste the HTML code into the window here, but what about all the graphics that were turned into png files? Ah well… a bit of manual fiddling. Equations and symbols seem to sit high, and some of the inline equations have been broken into a mix of graphics and characters… still, not too bad. The PDF version is available here.
Planes perpendicular to vectors
Say you have a vector in real space, expressed in direct lattice terms, for example

\[ \mathbf{t} = u\mathbf{a} + v\mathbf{b} + w\mathbf{c} . \]

You may want the reciprocal plane(s) perpendicular to this vector.
Why? Because correlations in a crystal collapse the scattering into features perpendicular to the direction of the correlation. In a normal, fully ordered three-dimensional (3D) crystal, this collapsing happens in all three directions, so the scattered intensity coming off the atoms gets concentrated at points, the reciprocal lattice points, usually denoted hkl.
If you have only two dimensional ordering, the scattering is collapsed down in two directions but not the third, giving rise to rods or lines of scattering in reciprocal space (that is, in diffraction space). If there are only one dimensional correlations, the scattering collapses into sheets, that is, it is delocalised in two dimensions and only localised in one dimension (because there are only correlations in one dimension).
In diffuse scattering the crystal is typically long-range ordered in three dimensions, and the diffraction pattern shows nice Bragg peaks (hkl reflections). However, there can also be disorder, for example in the motions of the molecules or in the chemical substitution of one species of atom or molecule for another.
In a molecular crystal, one can sometimes identify a chain of molecules running through the crystal, and interactions within these chains are likely to be much stronger than those between chains. That tends to mean that the motions of the molecules along the direction of the chain (call that ‘longitudinal’ motion) are highly correlated, while they are not well correlated laterally.
In such a situation, the single crystal diffuse scattering will show ‘sheets’ of scattering perpendicular to the length of the chain.
Then we can say that a reciprocal space vector

\[ \mathbf{q} = q_{a^*}\mathbf{a}^* + q_{b^*}\mathbf{b}^* + q_{c^*}\mathbf{c}^* \]

lies in the plane perpendicular to \(\mathbf{t}\) if

\[ \mathbf{q} \cdot \mathbf{t} = 0 , \tag{1} \]

and these reciprocal vectors are defined in terms of the direct space vectors like this:

\[ \mathbf{a}^* = 2\pi \, \frac{\mathbf{b} \times \mathbf{c}}{\mathbf{a} \cdot (\mathbf{b} \times \mathbf{c})} , \]

and similarly for the other reciprocal vectors. The important thing for us to note is that this means \(\mathbf{a}^*\) is perpendicular to \(\mathbf{b}\) and \(\mathbf{c}\). This is important when we go to take dot products later on. The bottom line here is basically the volume of the unit cell, \(V = \mathbf{a} \cdot (\mathbf{b} \times \mathbf{c})\), and \(2\pi\) is just a scalar, so from the point of view of defining the plane that we want, these are not important.

We have three coefficients \(q_{a^*}\), \(q_{b^*}\) and \(q_{c^*}\) but only one condition, and since we have more variables than we need if we are to satisfy eq. 1, we can arbitrarily set \(q_{c^*} = 0\). Writing out the dot product in eq. 1 then gives

\[ \begin{split} \mathbf{q} \cdot \mathbf{t} = \frac{2\pi}{V} \big[\, & q_{a^*} u \,(\mathbf{b}\times\mathbf{c})\cdot\mathbf{a} + q_{a^*} v \,(\mathbf{b}\times\mathbf{c})\cdot\mathbf{b} + q_{a^*} w \,(\mathbf{b}\times\mathbf{c})\cdot\mathbf{c} \\ + \, & q_{b^*} u \,(\mathbf{c}\times\mathbf{a})\cdot\mathbf{a} + q_{b^*} v \,(\mathbf{c}\times\mathbf{a})\cdot\mathbf{b} + q_{b^*} w \,(\mathbf{c}\times\mathbf{a})\cdot\mathbf{c} \,\big] , \end{split} \]

and this is useful because, to take the last term on the first line as an example, \(\mathbf{c}\) is perpendicular to \((\mathbf{b} \times \mathbf{c})\) by the very nature of the cross product. This means that any terms with a repeated vector go to zero. Further, in the remaining terms the vector part is just of the form \(\mathbf{a} \cdot (\mathbf{b} \times \mathbf{c})\), which is the unit cell volume and a constant, which we can also factor out to be left with

\[ q_{a^*} u + q_{b^*} v = 0 , \]

which is nice and simple. This is not a surprise but still… Choosing \(q_{a^*} = v\) and \(q_{b^*} = -u\) satisfies this, so one vector in the plane is

\[ \mathbf{q}_1 = v\mathbf{a}^* - u\mathbf{b}^* . \tag{6} \]
The next step is to find another vector in that plane. This is just \(\mathbf{r} = r_{a^*}\mathbf{a}^* + r_{b^*}\mathbf{b}^* + r_{c^*}\mathbf{c}^*\), and if we use the same logic but, to make \(\mathbf{r}\) non-collinear with \(\mathbf{q}_1\), we choose \(r_{b^*}\) to be zero, we get an equation analogous to eq. 6. These can be summed up as

\[ \mathbf{q}_1 = v\mathbf{a}^* - u\mathbf{b}^* , \qquad \mathbf{r} = w\mathbf{a}^* - u\mathbf{c}^* . \]
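As a sanity check, eq. 6 and its partner reduce to a couple of lines of code. This little function is my own illustration, assuming u ≠ 0 (otherwise permute the roles of the axes):

```python
def plane_directions(u, v, w):
    """Reciprocal-space directions spanning the plane perpendicular to
    the direct-space direction [u v w] (coefficients of a*, b*, c*).

    Uses q1 = v a* - u b* and r = w a* - u c*; these are non-collinear
    provided u != 0 -- otherwise permute the roles of the axes.
    """
    return (v, -u, 0), (w, 0, -u)

def perp_check(q, t):
    # q . t reduces to the sum of coefficient products (times 2*pi).
    return sum(qi * ti for qi, ti in zip(q, t))

chain = (-1, 1, 1)                # the TPA chain direction [-1 1 1]
q1, r = plane_directions(*chain)  # -> (1, 1, 0) and (1, 0, 1)
```

Both returned directions dot to zero with the chain direction, as they must.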
Now, in the triclinic form II polymorph of terephthalic acid (TPA), each molecule has a -COOH group at each end. These H-bond strongly with the groups on neighbouring molecules and you get strongly correlated chains of molecules running along the [-111] (direct space) direction. Putting \(u = -1\), \(v = 1\), \(w = 1\) into the expressions above then suggests that the planes of scattering perpendicular to these chains will extend in the directions

\[ \mathbf{q}_1 = \mathbf{a}^* + \mathbf{b}^* \qquad \text{and} \qquad \mathbf{r} = \mathbf{a}^* + \mathbf{c}^* . \]
Now, does this work? Figure 1 shows some diffuse scattering data from TPA, measured on a synchrotron. It also shows the reciprocal axes, and the white, two-ended arrows show the directions of the diffuse planes; by counting Bragg spots it can be seen that these agree with the calculation above.
This means that we can ascribe these features to correlations in the displacements of the TPA molecules linked by the -COOH groups.
A Paper! Good God, a Paper: ‘Synchrotron X-ray diffuse scattering from a stable polymorphic material: terephthalic acid, C8H6O4’
I’ve been doing science for a long time, and while I’m in a bit of a career transition at the moment (see here for example), I’ve still got a few fingers in a few pies, and a few pieces of work slowly wending their ways through the system. Most recently, Eric Chan and I put out ‘Synchrotron X-ray diffuse scattering from a stable polymorphic material: terephthalic acid, C8H6O4’. It’s a paper about the fuzzy, diffuse scattering from two polymorphs of the title compound.
It’s out in Acta Crystallographica Section B: Structural Science, Crystal Engineering and Materials, a highly reputable but not open access journal, although they do allow authors to self-archive. At the moment, what that means is if you want a copy send me a message and I’ll punt one back to you.
What is terephthalic acid (TPA)? Well, it is a chemical used a lot in industry (plastics and such) and at room temperature it can crystallise out of solution in two forms, called (wait for it) form I and form II. (Well, actually the word ‘form’ is technically poorly defined in this context, and it’s better to just say ‘polymorph I’ and ‘polymorph II’.) In this context, a molecule is polymorphic if it can form more than one crystal structure and these structures can co-exist. Many materials change structure as you heat them up or squash them, but in a polymorphic system separate crystals of the structures can sit there side by side, under the same conditions. In most cases, those conditions are room temperature and one atmosphere of pressure.
The two room temperature polymorphs are both triclinic, so of low symmetry. The difference is in how the molecules are arranged relative to each other. In both cases the -COOH groups on the ends of the molecules connect strongly to those on neighbouring molecules, so long chains of molecules form. (In the picture here, the -COOH groups are those at the ends of the molecule consisting of two red (oxygen) atoms, one white (hydrogen) and the grey (carbon) atom attached to the two reds.) These chains are sort of like one dimensional crystals, and then they are stacked up (like logs or a pile of pipes), but you can stack them up with, say, the -COOH in neighbouring chains close together, or you might have the phenyl rings (that is, the hexagon of grey carbon atoms) in one chain adjacent to the -COOH in the next. So in that sort of way you can get different crystal structures depending on how you stack things up.
Anyway, the paper looks at these polymorphs and how they are similar and how they differ. It uses my old ZMC program, which you can download from here (it comes with an example simulation, though not this one I’m talking about now). (That link goes to a paper I wrote and published for an Open Access journal, which I chose specifically so that you could go and download ZMC and everything for free…)
So in doing this I think about the connectivity of the molecule — how do the atoms depend on each other and where does the molecule need to be able to flex and twist? That means I end up drawing diagrams like this one:
That’s exciting, isn’t it? I start at the middle (X) and then each atom is positioned relative to the ones that went before. Here’s another picture (because I happen to have it handy)…. This shows how the atoms were numbered, and how by numbering them correctly and building the molecule up in the right order it is easy to let the -COOH groups spin around.
Here I show typical data. You can see the little white spots — these are the sharp diffraction peaks, Bragg peaks, and they indicate where a lot of X-rays were reflected off the crystal. They are what is used to work out what is usually called the ‘crystal structure’ which consists of the unit cell (the repeating unit) that the crystal is made up from. But you can also see blobs and streaks and stuff, and these are wider (‘diffuse’) features, and these tell us about how the molecules interact and shuffle each other around, and stuff like that.
Anyway, the paper is online now. The DOI link is https://doi.org/10.1107/S2052520616018801. One thing I really like about it is it’s got a mathematical appendix. I always wanted to write an article with a mathematical appendix. I think I might post on that separately.
I feel compelled to make a few comments on the recent changes to the way in which journal publications are to be evaluated in many research organisations and funding bodies.
There is a thing called the SNIP (Source Normalized Impact per Paper). It sounds very plausible, but sadly it is just more nonsense used to berate researchers. For example, it says, “The impact of a single citation is given higher value in subject areas where citations are less likely” — which seems to make sense, since it is harder to get highly cited in those areas. But maybe some areas have low citation counts because citations are simply not that area’s traditional measure of success.
More important for researchers is the question of granularity. Is it harder to get highly cited in biological crystallography or in solid state crystallography? Or do you lump them all under a single heading called ‘crystallography’, even though solid state crystallography borders on physics and protein crystallography on biology? Maybe you normalise a journal’s score according to the fields it says it publishes in — opening the way for a journal to ‘tune’ its stated field to maximise its score. Suddenly, we have more options for manipulating the terms of reference to get the result we want. The very fact that the normalisation is attempted adds a whole new layer where graft, misdirection and manipulation can happen. And does. For example…
Here are three journals that cover condensed matter physics. They have the same mandate, effectively, and researchers think of them as part of the same cohort, even if they are distinctly not considered as of equal quality.
- Physical Review B: IF 3.7, SNIP 1.204, IF/SNIP ratio 3.1
- J. Phys.: Condensed Matter: IF 2.2, SNIP 0.901, IF/SNIP ratio 2.4
- Physica B: IF 1.4, SNIP 0.918, IF/SNIP ratio 1.5
So, Physica B gets a SNIP higher than JPCM despite having a much lower impact factor. Why? Presumably because it is being normalised against a different subset of journals. But there is a more insidious reason… Physica B is published by the same publisher that hosts the SNIP data. No doubt they can completely justify the scores, but the bottom line remains that the SNIP is clearly misleading and more open to manipulation. Physica B’s SNIP score suggests that a citation in Physica B is about twice as valuable as one in Physical Review B (because it takes about 3 PRB cites to get a point of SNIP but only about 1.5 Physica B cites), which is a complete and utter lie. It should be the other way around, if anything.
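The arithmetic behind those ratios, for anyone who wants to plug in other journals (the reading of the ratio as ‘citations per point of SNIP’ is my own back-of-envelope interpretation):

```python
# Figures as quoted above; the ratio is roughly how many citations'
# worth of impact factor it takes to buy one point of SNIP.
journals = {
    "Physical Review B":          (3.7, 1.204),
    "J. Phys.: Condensed Matter": (2.2, 0.901),
    "Physica B":                  (1.4, 0.918),
}

ratios = {name: round(impact, 1) and round(impact / snip, 1)
          for name, (impact, snip) in journals.items()}
```

Swap in any journal’s IF and SNIP and the same two lines give its ratio.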
It’s all rubbish, but it is dangerous rubbish because I know that people’s careers are being evaluated by reference to numbers like these. People will get fired and hired, though more likely the former, based on numbers like these.
At least a bloody toss of a coin isn’t rigged.
The AANSS is a great mix of formality and informality, quality science in a relaxed atmosphere. Anyone who has or might or ought to use neutron scattering in their work (and isn’t that all of us, really?) is invited. And here’s a trick: Registration is $50 cheaper for ANBUG members but ANBUG membership is free! So join up!
It has long been an intention of mine to take our techniques for exploring the way the atoms are arranged in complicated materials and apply them to superconductors. The crystal structures of the oxide (high-temperature) superconductors are similar to those found in ferroelectric materials, which we have looked at in some detail. The difference is that in ferroelectrics the positions of the atoms relate directly to the interesting properties, since the ferroelectricity arises from atomic displacements (that is, from atoms moving around), whereas in superconductors the useful property shows up in how the electrons behave, and while this must be enabled by the crystal structure, the link is less direct. Even so, it seems to me that if we want to have a good idea of how the properties arise from the structure, then we need to know what the structure is.
One of the high-temperature superconductors is HgBa2CuO4+δ, a classic ‘copper oxide layer’ superconductor, descended from the original high-Tc materials discovered in the late 1980s. We found some data on it in the literature, and decided that while the modelling there was a useful place to start, the model that was developed did not really do a great job of mimicking the observed scattering. Hence, we decided to re-analyse their data.
In summary, we find that when the extra oxygen atoms are added to the structure (that’s the ‘+δ’ in the chemical formula), they go into the structure as long strings of atoms, as correctly identified by the authors of the paper with the original data, which is behind a paywall. What we have done that is new is improve the agreement between model and data by adjusting the positions of the surrounding atoms; it makes sense that when you stuff new atoms into a structure, the ones already there have to adjust to accommodate them. Based on things like bond valence sums, we can get some idea of what these adjustments should be, and then create a model crystal in which the atoms are pushed around in sensible ways in response to the added oxygens. These new atomic positions will then influence the environments of other atoms, and of electrons moving through the structure. Here is an image to break up the text:

Since the paper is open access, I won’t go into massive detail here, but when it comes to modelling the streaks of scattering in the pattern the results are pretty solid. There are some other, subtle details we continue to work on, but so far I think we can conclude that the methods of Monte Carlo analysis of single crystal diffuse scattering promise to deepen our understanding of superconductors and maybe — maybe! — will help us design ones that work at ever-higher temperatures.
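As an aside on the bond valence sums mentioned above, the standard Brown–Altermatt expression is easy to play with. This sketch is mine, not from the paper, and the numbers are illustrative; check a current bond-valence table before relying on the R0 value:

```python
import math

def bond_valence_sum(bond_lengths, R0, b=0.37):
    """Brown-Altermatt bond valence sum: each bond of length R
    contributes s = exp((R0 - R)/b), and the sum over an atom's bonds
    should land near the atom's formal valence."""
    return sum(math.exp((R0 - R) / b) for R in bond_lengths)

# Four in-plane Cu-O bonds at a typical cuprate distance of about 1.93 A.
# R0 = 1.679 A is the commonly tabulated Cu2+/O2- value -- an assumption
# here, so check a current table before relying on it.
v_cu = bond_valence_sum([1.93] * 4, R0=1.679)  # comes out near 2
```

A sum far from the formal valence flags an over- or under-bonded site, which is the kind of signal that tells you how the neighbouring atoms should relax around an interstitial oxygen.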