Archive | ramblings

World ‘World Day’ Day — you know it makes sense.

Every day seems to be World Something Day. Today is World Meteorological Day. Tomorrow will be something else, and the day after something else again. And those are just the UN sanctioned ones.

I think we need to have a day to honour all the people that get together and organise topic-based days. It takes a lot of coordination and determination to make sure that everything finds its right place. You don’t want World Dog Day and World Cat Day occurring at once, or you’ll just have lots of trouble in lots of parks around the world. You probably don’t want World Chocolate Day and World Diabetes Day to coincide. Or International Women’s Collaboration Brew Day to clash with Alcohol Awareness Month.

So, when should we schedule World World Day Day, a day organised in honour of all those people who spend their time organising days in honour of people? Somewhere near Administrative Professionals’ Day, perhaps.

Organising an event like that would be a good use of someone’s time. Maybe we could have a day to celebrate them.

 

Fleas and so on.

 

Word madness: Can’t save, won’t save. ‘A file error has occurred’

Word’s useless error message. Notice the ‘Was this information helpful’. What do you think?

 

Got this error, and they had the temerity to ask me if it was helpful. Pricks. Anyway. Could not save to new name. Could not save to external media. Could not save elsewhere on C:. In short, could not save.

No.

One bit of advice I have read is to wait until Word does an autosave, then kill Word using Task Manager; when Word is restarted it will offer to recover the file. Sounds dangerous to me. I waited, but the autosave never came.

First thing I did was print to PDF with all track changes and everything visible so I would at least have a record of what the file looked like.

Then created a new blank file. Tested that it could be saved. Yes. And in the same folder as the original file. (I knew that should be OK since I printed to PDF into the same folder).

Went to the file I wanted to rescue, with track changes and all comments visible. Ctrl-A, Ctrl-C.
Went to the new empty doc and pasted. Got the text and comments but not the track changes information. Well, that is still useful as a backup.

Save.

Now, it should be possible to make a copy with track changes information.

https://word.tips.net/T001783_Pasting_Text_with_Track_Changes.html

Another handy way to copy the text is to use the spike. Word users are so familiar with using the Clipboard to cut, copy, and paste information that we often forget about the spike. This is an area of Word that acts like a secondary Clipboard, with some significant differences. (You can learn more about the spike in other issues of WordTips or in Word’s online Help.) To use the spike to copy and paste text with Track Changes markings intact, follow these steps:

  1. In the source document, select the text you want to copy.
  2. Press Ctrl+F3. The text is cut from the document and placed on the spike. (If you wanted to copy, not cut, then immediately press Ctrl+Z to undo the cut. The selected text still remains on the spike.)
  3. In the target document, place the insertion point where you want the text inserted.
  4. Make sure that Track Changes is turned off in the target document.
  5. Press Shift+Ctrl+F3 to clear the spike and insert the spike’s text into your document.

So I went to the source document and hit Ctrl-A, then Ctrl-F3.

Opened a blank document with the same template, track changes turned off (it is off by default, I think).

Shift-Ctrl-F3

But it does not save! The problems have come with it!

So that does not help.

Now, if I turn off track changes and accept all changes, I can save the document – so it is a bug somewhere in Word’s track changes code.
If the problem occurs again, I can try the spike method with the different aspects of track changes turned on and off, to narrow it down.

So no satisfactory solution discovered. I do not know what change I put in that caused the issue, and it has never occurred before. So… I dunno. The above ideas are just partial solutions.

 

Solutions to problems nobody asks about.

Horrible errors on my Linux box.

They appeared when trying to boot up. I mean, eventually it did boot, but this was not good:

Feb 27 15:17:00 lauequad kernel: [21057.921922] ata2.00: cmd 25/00:08:00:08:c3/00:00:16:00:00/e0 tag 0 dma 4096 in
Feb 27 15:17:00 lauequad kernel: [21057.921923]          res 40/00:00:00:4f:c2/00:00:00:00:00/00 Emask 0x14 (ATA bus error)
Feb 27 15:17:00 lauequad kernel: [21057.921924] ata2.00: status: { DRDY }
Feb 27 15:17:00 lauequad kernel: [21057.921932] ata2.00: hard resetting link
Feb 27 15:17:01 lauequad kernel: [21058.643829] ata2.01: hard resetting link
Feb 27 15:17:01 lauequad /USR/SBIN/CRON[6290]: (root) CMD (   cd / && run-parts --report /etc/cron.hourly)
Feb 27 15:17:01 lauequad kernel: [21059.118482] ata2.00: SATA link up 1.5 Gbps (SStatus 113 SControl 310)
Feb 27 15:17:01 lauequad kernel: [21059.118493] ata2.01: SATA link down (SStatus 0 SControl 300)
Feb 27 15:17:01 lauequad kernel: [21059.226065] ata2.00: configured for UDMA/33
Feb 27 15:17:01 lauequad kernel: [21059.250462] ata2.00: device reported invalid CHS sector 0
Feb 27 15:17:01 lauequad kernel: [21059.250466] ata2: EH complete
Feb 27 15:17:32 lauequad kernel: [21089.830651] ata2: lost interrupt (Status 0x50)
Feb 27 15:17:32 lauequad kernel: [21089.830669] ata2.00: exception Emask 0x52 SAct 0x0 SErr 0x58d0c02 action 0xe frozen
Feb 27 15:17:32 lauequad kernel: [21089.830672] ata2.00: SError: { RecovComm Proto HostInt PHYRdyChg CommWake 10B8B LinkSeq TrStaTrns DevExch }
Feb 27 15:17:32 lauequad kernel: [21089.830674] ata2.00: failed command: READ DMA EXT
Feb 27 15:17:32 lauequad kernel: [21089.830677] ata2.00: cmd 25/00:08:00:08:c3/00:00:16:00:00/e0 tag 0 dma 4096 in
Feb 27 15:17:32 lauequad kernel: [21089.830678]          res 40/00:00:00:4f:c2/00:00:00:00:00/00 Emask 0x56 (ATA bus error)
Feb 27 15:17:32 lauequad kernel: [21089.830679] ata2.00: status: { DRDY }

Except they were all nicely coloured and arranged by vim’s syntax highlighting.

This was in the file /var/log/syslog.

Freakin’ scary. I thought one of my hard drives was on the way out. Fortunately it’s not my main drive, the one that houses / and /home, but it’s the second drive which mounts at /home/username/Music.

So, I thought maybe the drive was on the way out. My backups were up to date, but when I went to have a look in ~/Music, I noticed some files were corrupt. On boot, the messages included one telling me to run fsck on /dev/sdb1 (the Music partition) before dropping me into a shell, and then fsck told me it could not fix the drive…
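Not part of the original repair, but before blaming the drive itself it can help to check whether the errors in /var/log/syslog all cluster on a single ATA link, which points at the cable or port rather than the platters. A minimal sketch (the regex and sample lines are illustrative, not exhaustive):

```python
import re

# Match kernel ATA trouble lines and capture the port/device (e.g. "ata2.00").
ATA_ERR = re.compile(
    r"kernel:.*\b(ata\d+(?:\.\d+)?): (?:exception|hard resetting link|lost interrupt)"
)

def ata_error_counts(lines):
    """Tally ATA error events per port from an iterable of syslog lines."""
    counts = {}
    for line in lines:
        m = ATA_ERR.search(line)
        if m:
            port = m.group(1)
            counts[port] = counts.get(port, 0) + 1
    return counts

sample = [
    "Feb 27 15:17:00 lauequad kernel: [21057.921932] ata2.00: hard resetting link",
    "Feb 27 15:17:32 lauequad kernel: [21089.830669] ata2.00: exception Emask 0x52 SAct 0x0",
]
print(ata_error_counts(sample))  # -> {'ata2.00': 2}
```

If every event lands on one port, as in the log above, a flaky cable or connector is a prime suspect, which is exactly what it turned out to be here.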

Hmm…

Double-checked my backups were current, then unmounted the partition and used gparted to reformat it freshly as ext4. Started to copy the files across from the backup.

Stopped. Could not access the drive.

Hmm…

Remembered an old POST OF MY OWN.

SATA cable plugs sure do wiggle in their sockets. A lot more than old IDE ribbon cables.

Powered down. Removed power cable. Opened case. Noted which SATA cables went from which socket on the motherboard to which drive. Removed them all, blew some dry air into the plugs and cable ends. Replaced the cables and gave them a good wiggle, then left them, making sure they were not getting tugged out or sideways by tension but were square in the sockets. This involved rerouting some cables so they were more comfortable, and tying a bunch of unused power plugs up out of the way.

Reboot. No error messages. Mounted the backup drive. Copied 240+ GB of backups onto the blank drive. All faultless. Seems to work perfectly.

Take home message: SATA cables are fussy and can cause problems that might look like something worse.

Something worse.

Is it better to go off-line when teaching?

Students, just like most of us including me, are too distractible, especially younger ones lacking self discipline, and by younger I mean first year university, not genuinely young. These days we put the content and the tutorial questions on the Learning Management System (LMS, really just a website) and we tell them to use the LMS to access the questions and the supporting materials and such. Once upon a time they’d just get a bunch of photocopies (‘photostats’) or before that roneos (mimeographs) or just “copy this down off the board.” I’m not pining for the past, I’m trying to work out how we can combine the best of then and now.

What happened then was we’d come to class having not looked at anything beforehand, we’d copy down a bunch of questions or question numbers off the blackboard (it wasn’t a whiteboard), like ‘Ch 8 Q 12-18’, then we’d have the book open in front of us and we’d whisper to each other while we were supposed to be working out the answers. Hmm.

What happens now is this:

They come to class having not looked at anything beforehand (just like in the old days), because they know they can access it when they get there (we knew we’d be given it when we got there, back in the day, so no difference there). But, and this is different now, they then spend ten minutes getting onto the university network and getting distracted by Facebook or whatever and don’t download the questions until the tutorial is half over. Then they get out their notebook (or tablet and stylus) and read the question and… check their messages. Then they show the guy sitting next to them a cat video. Then they laugh and eat some Skittles (fine, fine, that is not the internet’s fault), then they look at Pinterest or for all I know Tinder, and then I ask them how they’re going and they mumble and we’re over half way through now and they have written down a few bits of data pertaining to the first question and that’s it.

Okay, maybe I’m overstating, but I have seen it happen that way. I’m not just fighting any innate apathy or disinterest (or depression or sense of futility) to get them to do the work, I am fighting the single most interesting thing the human race has ever constructed — a world wide distraction machine that has everything on it and available at the touch of a screen.

At best, even when they are doing some physics or mathematics, their attention is divided — they are always ready to pounce on an alert from whatever bit of social media they use, so their brain is never really thinking about the questions we give them to (we hope) help them learn.

Now, in the past when you copied a question off the board, it went in your eyes, through your brain and out your fingers onto the paper. I’m not sure that’s much better than not engaging with it at all, but it can’t be worse. You could only really talk to the people either side of you, just as students can now; there were by definition fewer distractions then, because students now have all the ones I had plus smart phones. Do they deal with them better than I used to? Valid question. Maybe these days they have extra information, extra connectivity, and the ability to use that without being consumed by it.

I’m not sure.

I started thinking about this post while I stood there watching students flick away from Snapchat (or whatever it was) and back to the LMS whenever they saw me coming. A few were able to use the ‘net to find useful information, or a website with some helpful content, and that’s good because a working scientist or problem solver (engineer, IT, whatever) does just that, calling on the info around them as well as what they know. But those students were a small minority.

I recall thinking how I would really, really like to give them all a paper copy of the questions or, better, ask them to bring their own copies (then at least they would have looked at it to the extent of downloading and printing it off and getting it from the printer with their own actual physical fingers before they got there — does that count as ‘engagement’?), and then use just their notebook, their bog-basic calculator and their textbook (they still exist, they do!) to tackle the problems.

I don’t say the web is useless. It is great for communication, for extra activities and resources. They can use the web to access the material easily and flexibly when they are not in my class. I use it to distribute videos to buttress the material, to direct them to external resources, though Britney Spears’ Guide to Semiconductor Physics is getting a little behind the zeitgeist now… The WWW ought to be great for collaboration, for ready access to what the students have not internalised. For simulations, for VR, for virtual laboratories, for Skype visits to major laboratories, for feedback, for interaction, for… the sky is the limit.

But not if you can’t sit still long enough to actually do it.

We’ve tried to engage the students to make them want to be there. I mean, that should solve everything. And there are always a few who do want to be there, and that’s great; they learn almost regardless of what the teachers do. But some students are in the class because they have been told to be there, because the subject is a prerequisite for what they really want, because they thought they would like it and now it’s too late to drop out without recording a fail, whatever. By giving them the option to more easily be mentally elsewhere when they have not developed the self-discipline to choose to do what needs to be done, I’m not sure we’re helping. I wonder if more distraction-free classroom time would have its benefits as part of a broader suite of learning opportunities? Some of the environments would use all the tech at our disposal, and some would just have the student and their brain and the stuff to be tackled.

I just want the best of both worlds; is that too much to ask?

 

Old fart, I am.

Why I Dislike Metrics

Metrics are used to measure a researcher’s output. How many publications? Patents? Students? Where are they publishing? Are they being cited? How many dollars in grants are they pulling in?

It’s tricky, because researchers at universities do need to be held accountable for the money invested in them — and the opportunity given to them that may have been given to another. Yet the outcomes of research can be diffuse, slow to materialise and hard to evaluate. A great conceptual breakthrough may have little impact initially. The investigator may have been fired by the time it is recognised. How does a non-expert administrator (who holds the purse strings) distinguish between a researcher who is ahead of the curve, and so not being cited because there are few others working on similar ideas, and one who is poorly cited because they are simply dull? Both are likely to have a tough time getting grant money, too.

Such an administrator falls back on metrics. Impact factors, grant dollars accrued, and so on. Complex formulas are developed. So much for a publication in one of these journals, less for one in these; citation rates are multiplied by this and divided by that, papers with a lot of authors [are|are not] (choose one) down-rated…and when government agencies that dole out grant money choose a particular metric, there’s really no choice.

Just looking at publications, sheer numbers were once the ‘in’ thing. Then it was citations. Then the H-index, the M-index, the insert-your-own-clever-metric-here index, who knows. Now there are scores that mean publications in lower-ranked journals will actually count against a researcher, such that when comparing two researchers, one with four papers in ‘top’ journals and one with four in top and three in middle, the latter would actually be penalised relative to the first.
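As an aside, the h-index mentioned above is at least simple to state: it is the largest h such that h of a researcher’s papers have at least h citations each. A minimal sketch, purely illustrative:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:       # the rank-th best paper has at least `rank` citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # -> 4: four papers have at least 4 citations each
```

Note what the formula cannot see: a single transformative paper, a long useful tail, or the norms of the field, which is rather the point of the complaint here.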

I cannot understand how this can be considered equitable, reasonable or sensible. I recognise that it is better to have high impact than low. I recognise that staff who consistently fail to have high impact need to improve that record. I have no problem with that. But the idea that a tail of papers in lower ranked journals is to be penalised is short-sighted, counter-productive and shows a lack of understanding about how science works. I will not speak for other fields.

(1) If I have a postgraduate student, or even an honours student, who had produced a nice result, a novel result, but not a high-impact result, I must now deny them the right to publish that result and build their publication record. They will finish their studies with fewer papers, less experience in writing up their work, a poor publication record, and less chance of employment. Writing for publication is a valuable part of a student’s training. By publishing a (possibly minor) paper extracted from their thesis before the thesis is submitted, a scholar gets feedback on their work and their writing ability from a wide audience, begins to build a profile, and can be more confident that the thesis will be passed because a component of it has already passed peer review.

(2) It would be easy for any such rules to be biased against staff publishing in certain areas. Who decides what is a ‘top’ journal? How is this harmonised across fields? Some fields are oddly replete with high-ranking journals and some have a dearth. This needs to be recognised.

(3) Science is a dialogue, a discussion. Many important results come from bringing together many small results. By forcing staff to only publish their highest-impact work, many results that might be useful to other workers in the field will never see the light of day, will never contribute to the debate. This holds back the field. To give a simple example, databases like the Inorganic Crystal Structure Database are populated by thousands of individually minor results. Most of these were not published in high-impact journals, yet data mining across that database and others has produced powerful results that are of great value. Great cathedrals can be made from many small bricks. This policy prevents those bricks from accumulating. It works against the fundamental (okay, and idealised) nature of science as a transparent, collaborative enterprise.

(4) The building of collaborations will be inhibited. If I have a colleague at a university or facility (like a synchrotron, say) who is not subject to the same rules, they will quite reasonably say that a piece of work may not be ‘high-impact’ but is worth publishing nonetheless, and I will have to either accept the impact on my own record or deny publication. That is hardly a great way to build a relationship.

(5) Metrics have a habit of being applied retrospectively. Evaluating my performance in 2015 or 2016 (or even further back) against criteria that were not in force or even available at the time is simply unethical. If organisations are going to use metrics, it is because they want to (a) select staff that are performing at a high level and (b) encourage staff to perform at what is considered to be a high level. Evaluating staff who have been trying to satisfy one regime against totally new criteria is unfair and unreasonable, yet happens all the time. There need to be grandfather clauses.

I fully agree that we need to do high-impact science. I fully agree that staff need to be encouraged to publish in top journals. But actively precluding publishing in lesser, but still sound, journals is short-sighted and dangerous, and an example of how the careless use of metrics is destructive. Perhaps metrics are a necessary evil, but I have yet to see whether they do more good than harm.

 

Pontification over.

nu and vee, I and l — design of physics books and modern fads for fonts

Rant.

As someone working in a technical field, I often feel like designers do not really appreciate the subtleties of notation and how to make it clear. In the title of this post, ‘I and l’ is upper case ‘eye’ and lower case ‘el’. Not that you can tell.

For example, because so many symbols are used, formulas can often contain symbols which might be mistaken for each other. The classic example is…

[image: ilb1]

and here is the same formula using some sans serif fonts, using Microsoft Word…

[image: ilb2]

Now, this is not to criticise these fonts. They are just not designed for this job. It is the chooser of the font who is being a wee bit silly if these fonts are used in a mathematical document. An even trickier example is…

[image: nuv]

which I have produced in LaTeX; the nu and vee are well-differentiated, but that is because the font was designed by someone (Knuth) with the express purpose of laying out mathematics.
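For anyone wanting to reproduce the comparison, a minimal LaTeX fragment along these lines sets the troublesome pairs side by side, assuming the default Computer Modern maths font:

```latex
\documentclass{article}
\begin{document}
% Greek nu versus italic vee: distinct in Computer Modern,
% often indistinguishable in a sans serif text font.
$E = h\nu$ \quad versus \quad $E = hv$

% Upper-case eye, lower-case el and the digit one:
$I$, $l$, $1$
\end{document}
```

Typeset this, then try the same characters in whatever display font a designer is proposing; if the pairs merge, the font is the wrong tool for the job.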

If I were able to give advice to anyone out there designing a text with mathematics in it, it would be to look at the two letter/symbol pairs I have shown here, and make sure they can be told apart. If not, the font choice is a poor one and needs to be changed. And what is fashionable at the moment is irrelevant beside the need for clarity and the fight against ambiguity and lack of precision.

 

Rant over.

Reink sample: A few weeks later

Just posting a sample of text from reinked ribbon a week or two after doing it. Looks OK to me.

 

Small sample. Real colour. Low res.

 

Hi res, well, 300dpi anyway. On used paper, so there is some show-through.

So, it’s not very blue, but it is blue. It is pretty even, and has actually improved since I did it. And it has had plenty of time to dry out and it hasn’t, even the bit of ribbon near the paper, which is exposed to the air. Maybe if I left the machine unused for weeks it would dry out, but so far it looks like the glycerine works.

Time wasters.

Ignorant reflections on US politics

There have been some comments in the light of the recent US election about reforming/abolishing the electoral college — after all, apparently, like Gore in 2000, Clinton got more votes but they were in the wrong places and so Trump won. Votes in less populous states count more, and so on.

I would argue, based on what I’ve read and seen, that there are more important things. The US needs an electoral commission, independent of Federal and State governments, that makes sure polling booths are equally available to all groups across the country. Voting systems need to be uniform. There need to be as many booths in non-white communities as in white, for example, and they need to be open. It needs to be easy to vote by post, in advance and from overseas. Recall Florida 2000 — Federal votes need to be immune from state-based interests.

That is a far bigger factor than the EC. I mean, sure, reform that — but you need to fix the bigger problems first. It needs to be easier to vote, equally easy everywhere. Once everyone gets to vote under a more uniform system, then you can get the inputs into the EC system to be more representative. At the moment the EC is GIGO. Fix the garbage going in first.

Second, once you’ve made it easy for everyone to vote, you would (ideally, though this will never ever ever ever ever ever ever ever ever ever ever happen) introduce what I call ‘compulsory attendance’. Put simply, you can fail to vote if you want to, but you have to show up (or postal vote or whatever) and tick the Brewster’s Millions box (draw a funny face on the ballot paper, whatever) or the Feds come after you. Failing to vote through apathy, and showing up and choosing not to vote for anybody because they all suck, are two very different things, and send two very different messages, and what it does — and this is very important — is it removes the effect of voter turnout. Getting out the vote is discounted as a factor. That means there can be more focus on policy and genuine comparison of the parties. And you have to appeal to a wider range of voters, which tends to cut down on the more extreme ideas like Mexican walls. It is not undemocratic, because you do not have to vote, you just have to actually choose to not vote, rather than just be lazy.

But there’s no way you can have a law like that until everyone has an equal chance to vote, and right now that’s not the case.

Reinking a typewriter ribbon II: My crazy experiments yield something

This follows on from part I.

So the next thing to do was get a clean glass jar, drop in twenty or so drops of ink and one of glycerine, then mix. Well, here is a numbered list of things:

(1) I made up a small jig to hold two spools. The spools were not parallel enough, but it was okay for a first pass.

Do a little jig. I just wound the spools by hand.

(2) Between the spools I put a stamp pad, and I loaded the stamp pad with some Artline stamp pad ink that had been mixed with glycerine, the latter simply bought from a supermarket, from the cake-making aisle. The ratio of ink to glycerine was about 10:1, possibly richer in ink than that.

Ingredients.

(3) I used a glass jar held on its side to push the ribbon against the pad as I wound the ribbon from one spool to the other. I recharged the pad a couple of times on the way.

(4) When I was done I dropped a little excess ink on the tightly-wound spool and let it soak in, just because I had some left. I wanted to see if the glycerine stopped the ink from drying out too quickly, so it was better to err on the side of having extra ink, so that I could be sure that a simple lack of ink was not the problem.

(5) I note that the ribbon is quite old (metal spools) and rather frayed which leads to stray strands of nylon flopping around and giving unwanted spidery lines on the page. Can’t blame the reinking for that.

(6) I found that immediately after reinking it worked pretty well. An hour later it still worked just as well. Next day even the exposed bit of ribbon was still usable, so it seems to work! I seem to have put too much ink into the ribbon, and it is not as uniform as I would have liked, so probably not much good for serious work, but for a few notes and whatnot it would be fine, and with a bit of trial and error I think I’ll be able to get ribbons that do almost as well as a bought one, and in some funky colours. I can even experiment with buying a lightweight half-inch nylon ribbon and inking it.

Not exactly brilliant, but a good place to start. Scan is b&w so not blue at all.

I am wondering if a different brand of ink — Horse brand comes to mind — would not need the glycerine added. I’d be interested to hear.

Live and don’t learn.

NaNoWriMo 2016: InNoReMo

Well, I posted on NaNoWriMo in 2014 and 2015 so I thought I ought to in 2016 even though I’m not doing it this year. Why not do it this year?…

…Because I already don’t revise and rewrite enough. I have done very little with the 100,000 words that are the legacy of the last two years, and while you can’t polish a turd, I need to ensure that they are irredeemable before I erase them from existence and churn out more verbiage. And if they are not irredeemable I should do some work on them.

So, for me this is not NaNoWriMo, it is perhaps NaNoRewriMo. And given that the event is now international, it should really be InNoRewriMo, but that is even less pronounceable than the original. ‘InNoReMo’?

(Note to event organisers in the USA: ‘The USA’ is not synonymous with ‘the world’. Events that happen outside the USA are not ‘national’ USA events, and when only teams from the USA are eligible to compete, it is not a ‘World Series’.)

I stand by my conclusion that the event has some value but only for people who have reasonable expectations. Should one of the 50k blodges of text I blatted out in previous years turn into something decent, the event will have done something worthwhile for me. Since I wrote them largely to gain some idea of how I can handle a longer narrative, I got some useful experience. Since I did not write them expecting to produce salable copy, I was not disappointed when I did indeed produce drek.

But that’s enough drek for now.

 

Toons.