Friday, October 12, 2018

The Good Emperor

Oh, I didn't realise this (the bit about never visiting the shrine) about Japanese Emperor Akihito:
The chief priest at Japan's controversial Yasukuni Shrine is to resign after making remarks highly critical of Emperor Akihito.

In comments leaked to a magazine, Kunio Kohori said he believed Emperor Akihito was trying to destroy the shrine by not visiting it.

The shrine in Tokyo honours Japan's 2.5 million war dead but also enshrines convicted criminals of World War Two.

It remains a high source of tension with neighbours, particularly China.

Emperor Akihito, who will abdicate next year, has never visited the shrine.

He has instead sought reconciliation with Japan's wartime enemies.

He has expressed regret over Japan's military actions in both China and the Korean peninsula, and has also visited several Pacific battlefields to honour the dead, actions that have brought him into conflict with right-wing groups at home.

Thursday, October 11, 2018

Video violence and empathy

It was back in March that I last had a whinge about unnecessary bloodletting and violence in video games.  In that post, I complained that a couple of studies were claimed to show no connection between games and real life violence, but when you looked at the details, it was pretty ludicrous to conclude they were particularly useful studies at all.

Now I see that I had missed another study that came out at the end of last year, claiming to show that frequent violent video game players have lower empathy response.

Well, that's more aligned with my biases!

Anyway, the study was very technical in nature, and used EEGs and tests regarding looking at faces, etc.  All very technical.  As usual, with all of this sort of testing, it is best to treat it with a high degree of caution, but the test set up does sound a little less obtuse than that in the studies I complained about.   The abstract follows: 
Research on the effects of media violence exposure has shown robust associations among violent media exposure, increased aggressive behavior, and decreased empathy. Preliminary research indicates that frequent players of violent video games may have differences in emotional and cognitive processes compared to infrequent or nonplayers, yet research examining the amount and content of game play and the relation of these factors with affective and cognitive outcomes is limited. The present study measured neural correlates of response inhibition in the context of implicit attention to emotion, and how these factors are related to empathic responding in frequent and infrequent players of video games with graphically violent content. Participants completed a self-report measure of empathy as well as an affective stop-signal task that measured implicit attention to emotion and response inhibition during electroencephalography. Frequent players had lower levels of empathy as well as a reduction in brain activity as indicated by P100 and N200/P300 event related potentials. Reduced P100 amplitude evoked by happy facial expressions was observed in frequent players compared to infrequent players, and this effect was moderated by empathy, such that low levels of empathy further reduced P100 amplitudes for happy facial expressions for frequent players compared to infrequent players. Compared to infrequent players, frequent players had reduced N200/P300 amplitude during response inhibition, indicating less neural resources were recruited to inhibit behavior. Results from the present study illustrate that chronic exposure to violent video games modulates empathy and related neural correlates associated with affect and cognition.




Godless Episode 3

I'm still finding it is well acted and looks terrific, but I have two issues with episode 3:

*  too horsey;

* this show is starting to trigger my "why does Hollywood add so many splattery bullet-to-the-head shots in entertainment now?" complaint.   In fact, they are putting a lot of blood sprays in many shooting scenes - I really suspect that this is caused by contamination from video gaming aesthetics.  I have no way of checking this, but I very much doubt that in the 19th century, there was much to be seen by way of blood spray from your average bullet wound, at least to the body.  But because people are used to seeing huge blood sprays from any bullet wound in video games, they are inserting it in all shows now.

Rare paralysis

Seems to be an unusual spike in a rare children's paralysis condition in Minnesota.  A connection with a viral infection (mild in itself) seems likely, but it's interesting how long it can take to work out what causes what, medically.

Wednesday, October 10, 2018

More about food safety in the 19th century

My recent post about ridiculously dangerous milk was all due to reviews of Deborah Blum's book The Poison Squad, and from an NPR interview about it, I extract this:
You tell stories of kids dying from eating candy that was contaminated with lead. Given that this was causing real suffering in consumers, what kinds of arguments were people making for leaving this unregulated?

It's baffling, because you are in this period where food makers are knowingly using very bad things. I gave the example of arsenic, which was a green food dye also used to make the shellac that glosses up chocolate. But lead was used to color candies, and red lead was used in cheese. If people wanted to make a beautiful, orange cheddar cheese, they just dumped a little red lead in it. This is not people who didn't know it was bad, but there were things that made it permissible. There were no labels, and so there was no public pressure. It was just a pre-regulatory Wild West of food that permitted bad actors to do what they will, and so they did. It saved them a lot of money. You get this capitalistic feedback loop of people who were trying to make a living – and wanting to make more of a living. The consumer was both the guinea pig and the victim.

To no one's surprise, if you feed people formaldehyde, or arsenic or lead, they will get sick. And when you demonstrate that, why does it still remain so difficult to outlaw these substances in food? 

The food industry had been organizing itself to fight regulation. Wiley had been advocating and working with congressmen to get some kind of basic consumer protection. And these experiments caught national attention — they were front-page news, there were songs about them — and everyone was realizing that there is a lot of bad stuff in their food. There was an immediate pushback. Suddenly, congressmen are on the side of food business or getting offered more money. The food industry organizes to create a Food Manufacturers Association. They were phenomenally effective. They did a great job trying to damage Wiley's reputation publicly and deny what he was finding, and bullied and threatened congressmen to kill regulation every time it came up.
If Catallaxy was still a blog where you could usefully argue about libertarianism as a political philosophy, I would be commenting there about this.

But now it's just full of ratbags, and it's even hard to goad Jason to comment here...

Now that Nordhaus wins a Nobel, people are remembering Pindyck

ATTP has a post up in which he wonders out loud about an issue I've long complained about in relation to climate change impacts:
However, I do think there are reasons to be cautious about some of these economic analyses. Let me provide a caveat up front. I’m not an expert at this, so am happy to be corrected if I get something wrong, and am partly writing this in the hope that I might learn something more.

For starters, these analyses are typically linear. This essentially means that they can say nothing about the possibility of some kind of large shock. Some of these analyses actually suggest the possibility of quite small global economic impacts even for extremely large changes in climate (see links below), which would seem to suggest that there is some point at which these calculations break down.

Also, as I understand it, most of these analyses do not consider how climate change might impact economic growth itself (see this Carbon Brief Explainer about IAMs). If the global economy grows at 3% per year, then it will be about 10 times bigger in 2100 than it is today. A large economic impact in 2100, might then seem small by comparison to the global economy at that time. Equivalently, if you discount these future economic costs to today, they can also seem quite small. Is it reasonable to assume that global economic growth will be largely unaffected by climate change?

My own view, which I’m happy to be convinced is wrong, is that these kind of analyses are fine if you want to understand things like what would happen if we did something (like impose a carbon tax). They’re probably also fine if you’re interested in how the economy will response to relatively small climate and ecological perturbations, or will respond over the next few decades. Where I think we should be more cautious is when the climate/ecological perturbations are large, or when considering very long, multi-decade timescales.
 And someone in comments reminds him that Pindyck has been saying this for some years now.
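As a quick sanity check of the arithmetic in that quoted passage (a back-of-the-envelope sketch only, assuming a constant 3% annual growth rate from 2018 to 2100):

```python
# Sanity check of the "about 10 times bigger in 2100" claim in the
# quote above, assuming constant 3% per year compound growth.
years = 2100 - 2018                # 82 years
growth_factor = 1.03 ** years      # compound growth over that period

# The flip side of the same arithmetic: discounting a cost incurred in
# 2100 back to today at that 3% rate shrinks it by the same factor,
# which is why distant climate damages can look small in these models.
present_value_of_dollar = 1.0 / growth_factor

print(f"Economy in 2100: roughly {growth_factor:.1f}x today's size")
print(f"$1 of damage in 2100, discounted to today: ${present_value_of_dollar:.2f}")
```

So "about 10 times bigger" checks out (a bit over 11x, in fact), and the same factor is what makes a seemingly large 2100 cost look trivial once discounted.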

I remain quietly confident that in the next decade or so, the general view will be "come on, why did we ever think the economic modelling of climate change was realistic?" 


Well, his briefings do have to be given as stick figure illustrations...


I'm sure his very big brain will come to the right conclusions...

An amusing tweet

In reaction to this story:  
Gay students and teachers could be rejected by religious schools under changes to anti-discrimination laws being recommended by a federal review into religious freedom, according to a media report.
The former attorney general Philip Ruddock, who chaired the review, said the right of schools to turn away gay students and teachers should be enshrined in the Sex Discrimination Act.
this tweet:


Tuesday, October 09, 2018

A minor point about the Kavanaugh fight

One thing very clear about the Right's reaction to Christine Ford's testimony was that her voice drove them nuts - I've read scores of comments that her voice was too "little girl" to be real - it was all an act, and/or proof that she's mentally disturbed or an emotional wreck.  One or the other.

(Oh, and how ludicrous are the wingnut "body language" videos that have become a thing in the last couple of years, with their ridiculous appeal to expertise in uncovering the secret meaning of the body language of figures the Right want to imagine destroyed.  It's just pathetic, but they really believe in an area of "science" that was never more than flimflam from hucksters.  The one on Ford was particularly welcomed with open arms.)

Anyway, I thought - if there is any truth at all to the claim that her vocals at the hearing were an act, there would surely be someone, somewhere who has sat in a lecture or talk of hers and recorded it, and could prove that she sounds completely different in "real life".   What's the bet that there were Wingnuts desperate to turn up such material to attack her credibility?

But where are we now:  oh, that's right - no sign of any other voice recording has surfaced anywhere.  Despite her being an academic who (I presume) has had lecturing as part of her job for much of her career? 

I think it extremely safe to assume that no evidence will ever arise - because it never existed, and it was a case of wingnuts' imagination run wild - again.


The near perfect quote on Trumpian propaganda

Somehow I had missed previously reading about Hannah Arendt's comments on the use of propaganda in the rise of totalitarianism.  But I saw the quote extracted on Twitter, and it's absolutely perfect to describe how Trumpian lying is working with his "base":
The totalitarian mass leaders based their propaganda on the correct psychological assumption that, under such conditions, one could make people believe the most fantastic statements one day, and trust that if the next day they were given irrefutable proof of their falsehood, they would take refuge in cynicism; instead of deserting the leaders who had lied to them, they would protest that they had known all along that the statement was a lie and would admire the leaders for their superior tactical cleverness.
However, these days, it's not even clear that they will ever believe it was even a lie.  

I got the quote from an Open Culture post which explains:
Arendt, on the other hand, looked closely at the regimes of Hitler and Stalin and their functionaries, at the ideology of scientific racism, and at the mechanism of propaganda in fostering “a curiously varying mixture of gullibility and cynicism with which each member... is expected to react to the changing lying statements of the leaders.” So she wrote in her 1951 Origins of Totalitarianism, going on to elaborate that this “mixture of gullibility and cynicism... is prevalent in all ranks of totalitarian movements"
And, not that the term was around at the time she was writing it, but there is a deep irony in how those on the Right who decried post-modernism for its "truth is a mere social construct" attitude are now the side who have most comprehensively swallowed that kool aid without even realising it.     


Comedy noted

I quite like Conan O'Brien:  certainly he deserves some sort of award for weirdest looking dude to ever have a successful late night talk show.  His comedy writers seem to specialise in coming up with some pretty silly and eccentric stuff, and I particularly enjoyed a couple of sketches recently on Youtube:



Andy Richter is often very funny in his own right:



Did I note here before that I was surprised to read that Conan spoke in 2015 about getting treatment for depression and anxiety?

Well, now I see, in what must be yet another indication that getting into comedy and having psychological problems seems to go hand in hand to an extraordinary degree, Andy Richter spoke about lifelong issues with depression last year.  

Anyway, I see that Conan is changing his show to a half hour format next year.   I hope that helps him de-stress a bit.     


Monday, October 08, 2018

Things an over-active imagination can see happening

*  Brett Kavanaugh getting sacked after getting absurdly drunk at a Supreme Court welcome function, and being found in a dimly lit corner of the room trying to put "the hard word" on Ruth Ginsburg.

* That Winx horse's career coming to an end by it literally exploding just before it passes the post for what would be its 50th win.  

* Trump's hair catching fire, purely by the power of his own lies, during one of his Nurembergs.  

The civil war only one side wants

Hey, there's a very impressive tweet thread by one David Neiwert, who has studied the American  "coming civil war" rhetoric that's been ramped up by Wingnuts for the last decade or two.

Now, they have Trump, conspiracy nutter, who is lending them moral support by talking like this:
Democrats have become too EXTREME and TOO DANGEROUS to govern. Republicans believe in the rule of law - not the rule of the mob.
Yeah, sure, says the guy who laughs whenever his mini Nuremberg rallies break into "lock her up" chants. 

As far as I can recall the Republicans have precisely one act of politically inspired nutter shooting to get excited about - the baseball shooting in June 2017.   (Even so, which side of politics is the one that would want to toughen gun restrictions on nutters such as that guy?  Which side is devoted to supporting paranoid wingnuts in being armed to the teeth with military grade weaponry?)    The rest of it is handwringing over noisy protests outside of restaurants, or in front of Congress or the Supreme Court.

Republicans have conveniently become drama queens over the state of civil unrest spurred by their own politics:  race riots in the 1960s could end up with scores of deaths, and massive amounts of destruction.  Even in 1977, I see that the New York City blackout resulted in this:
In all, 1,616 stores were damaged in looting and rioting. A total of 1,037 fires were responded to, including 14 multiple-alarm fires. In the largest mass arrest in city history, 3,776 people were arrested
The state of civil unrest in the US is relatively mild, and it is part of the tribalist authoritarian nuttiness of the Right that they continually are trying to convince themselves that mainstream Democrats are dangerous socialists who will DESTROY the economy, the nation, etc.

It's absurd, and actually dangerous when their most ardent believers are weaponised.  

Update:   A typical example of absurd "reasonable Republican" commentary from Hugh Hewitt in WAPO, alleging that the real problem is incivility:
 Trump is as wearying today as Andrew Jackson must have been in 1829 to the people of both parties who are used to different rules sets. I am one of them. Thus my criticisms of the president are many and detailed. But my fear of the wild-eyed left is far greater than my discomfort with his bull-in-china shop politics.

The left, we saw this week and last, contrasts unfavorably with the president’s hyperbole and occasional cruelty. It is now a snarling, enraged collective scream. To give it power would be to risk fraying even further the common bonds of citizenship.
The fundamental problem for Democrats and the rest of the world:   you're trying to fight idiots.   

The comments following that will be white hot, I bet.


Mice can work it out

There's something charming about a study looking into how mice couples communicate after one of them has been "unfaithful":
The quality of conversations between California mice couples after one partner has been unfaithful can help predict which mouse pairs will successfully produce a litter of mouse pups and which males are good fathers, according to a new study on the evolution of monogamy.

More unimportant pop culture notes

*  All Australian males over the age of 40 have a crush on Julia Zemiro, don't they?  (Homer excepted, if I remember correctly.)   Hence I found myself watching the singing competition show All Together Now, which features a panel of "music industry figures" judging the contestants.  (The only big name among them is Ronan Keating, who I admit is a likeable TV presence.)

But as for the rest of the judges - who knew the Australian music business is (if you were to judge it by this show) completely dominated by gay/camp personalities?    It reminded me of the unknown D grade celebrities Britain manages to scrape up from somewhere for shows like "I'm a Celebrity, Get me Out of Here".   In fact, I see this show is a format import from Britain, so perhaps I shouldn't be surprised.

I don't know:  the set is pretty, and the judges are 3/4 ridiculous, and Julia is still funny sometimes.  I might watch it again.

*  My daughter likes a lot of Justin Bieber songs, but thinks he's nuts.  Yes, I like to point out:  he is a living example of how having excessive money, especially while young, causes more problems than it solves.  (A point I like to make often to justify my own less than desired income.)  She said the other day that maybe he's a "good boy" again, since he went back to his church.   Seems not to be true.

I liked this short interview with Hard Quiz host Tom Gleeson.  He is funny, and I am happy to understand how they warn guests about his style.

A minor observation

As I wrote below, I was watching that Batman v Superman movie on Friday and noted Holly Hunter was in it.  I saw her in something else recently, but I forget what it was.

I've always liked her, but in these last two appearances, I thought she had a somewhat (I don't know) slurry? issue with her speech which I had never noticed before.  It reminded me very much of what I noticed in Carrie Fisher in the last two Star Wars movies.   It made me think that perhaps it's what an ill-fitting partial denture can make a person sound like.  Yet surely they wouldn't have dentures but would go with implants if a tooth needed replacing.

I haven't noticed anyone else saying this about either of these actors, but I find it obvious in both of them (although more pronounced in Fisher).

Am I alone in this? 

Sunday, October 07, 2018

When milk was positively dangerous

The Atlantic had a brief extract from a book that has just come out, about food contamination in the 19th century:
We tend to think of our 19th-century forefathers thriving on farm-fresh produce and pasture-raised livestock, happily unaffected by the deceptive food-manufacturing practices of today. In this we are wrong. Milk offers a stunning case in point. By mid-century, the standard, profit-maximizing recipe was a pint of lukewarm water for every quart of milk—after the cream had been skimmed off. To whiten the bluish liquid, dairymen added plaster of paris and chalk, or a dollop of molasses for a creamy gold. To replace the skimmed-off layer of cream, they might add a final flourish of pureed calf brains.
Mmmm... calves' brains.

More on this somewhat nauseating topic of just how bad commercial milk was in those days can be found in a much lengthier article in the Smithsonian magazine.  Oddly, calves' brains were probably the least of a consumer's reasons to worry.  But first, the brains:
But there were other factors besides risky strains of bacteria that made 19th century milk untrustworthy. The worst of these were the many tricks that dairymen used to increase their profits. Far too often, not only in Indiana but nationwide, dairy producers thinned milk with water (sometimes containing a little gelatin), and recolored the resulting bluish-gray liquid with dyes, chalk, or plaster dust.

They also faked the look of rich cream by using a yellowish layer of pureed calf brains. As a historian of the Indiana health department wrote: “People could not be induced to eat brain sandwiches in [a] sufficient amount to use all the brains, and so a new market was devised.”

“Surprisingly enough,’’ he added, “it really did look like cream but it coagulated when poured into hot coffee.”
Gosh.
 
Anyway, the worst thing was the use of formaldehyde:
Finally, if the milk was threatening to sour, dairymen added formaldehyde, an embalming compound long used by funeral parlors, to stop the decomposition, also relying on its slightly sweet taste to improve the flavor. In the late 1890s, formaldehyde was so widely used by the dairy and meat-packing industries that outbreaks of illnesses related to the preservative were routinely described by newspapers as “embalmed meat” or “embalmed milk” scandals.

Indianapolis at the time offered a near-perfect case study in all the dangers of milk in America, one that was unfortunately linked to hundreds of deaths and highlighted not only Hurty’s point about sanitation but the often lethal risks of food and drink before federal safety regulations came into place in 1906.

In late 1900, Hurty’s health department published such a blistering analysis of locally produced milk that The Indianapolis News titled its resulting article “Worms and Moss in Milk.” The finding came from an analysis of a pint bottle handed over by a family alarmed by signs that their milk was “wriggling.” It turned out to be worms, which investigators found had been introduced when a local dairyman thinned the milk with ‘’stagnant water.”....

[a few paras about the horrible bacteriological state of milk at that time go here] 

The heating of a liquid to 120 to 140 degrees Fahrenheit for about 20 minutes to kill pathogenic bacteria was first reported by the French microbiologist Louis Pasteur in the 1850s. But although the process would later be named pasteurization in his honor, Pasteur’s focus was actually on wine. It was more than 20 years later that the German chemist Franz von Soxhlet would propose the same treatment for milk. In 1899, the Harvard microbiologist Theobald Smith — known for his discovery of Salmonella — also argued for this, after showing that pasteurization could kill some of the most stubborn pathogens in milk, such as the bovine tubercle bacillus.

But pasteurization would not become standard procedure in the United States until the 1930s, and even American doctors resisted the idea. The year before Smith announced his discovery, the American Pediatric Society erroneously warned that feeding babies heated milk could lead them to develop scurvy.

Such attitudes encouraged the dairy industry to deal with milk’s bacterial problems simply by dumping formaldehyde into the mix. And although Hurty would later become a passionate advocate of pasteurization, at first he endorsed the idea of chemical preservatives.
In 1896, desperately concerned about diseases linked to pathogens in milk, he even endorsed formaldehyde as a good preservative. The recommended dose of two drops of formalin (a mix of 40 percent formaldehyde and 60 percent water) could preserve a pint of milk for several days. It was a tiny amount, Hurty said, and he thought it might make the product safer.

But the amounts were often far from tiny. Thanks to Hurty, Indiana passed the Pure Food Law in 1899 but the state provided no money for enforcement or testing. So dairymen began increasing the dose of formaldehyde, seeking to keep their product “fresh” for as long as possible. Chemical companies came up with new formaldehyde mixtures with innocuous names such as Iceline or Preservaline. (The latter was said to keep a pint of milk fresh for up to 10 days.) And as the dairy industry increased the amount of preservatives, the milk became more and more toxic.
In the summer of 1900, The Indianapolis News reported on the deaths of three infants in the city’s orphanage due to formaldehyde poisoning. A further investigation indicated that at least 30 children had died two years prior due to use of the preservative, and in 1901, Hurty himself referenced the deaths of more than 400 children due to a combination of formaldehyde, dirt, and bacteria in milk.
Following that outbreak, the state began prosecuting dairymen for using formaldehyde and, at least briefly, reduced the practice. But it wasn’t until Harvey Wiley and his allies helped secure the federal Pure Food and Drug Act in 1906 that the compound was at last banned from the food supply.
It really was a different world back then.

And once again, you have to ask - how the hell do libertarians have the hide to argue that their philosophy works, in practice?

Two unnecessary movie reviews

What's that?  Drunk, rich fratboy became a judge at the Supreme Court after all?   I have brief comments to make about that, but later.

Meantime, perhaps I can't review it fairly, but I did try watching Superman V Batman Batman v Superman (sorry, I care so little about it I got the name the wrong way around) on free-to-air TV on Friday.

As I tried explaining to my son (who likes Christopher Nolan's Batmen and talks about wanting to see the Joker movie), I can't get engaged with any incarnation of Batman.   There's just a wall of superhero scenario credibility that I can't break through for this character - I find Superman and Wonder Woman more believable despite the silliness of the former's physics and the latter's mythological status.   Apart from not caring for dark angst as a key feature of a superhero character, I reckon Batman's problems in large part revolve around the super-villains:  Lex Luthor or even Green Goblin are more credible than a Gotham City full of Batman-level ridiculous costumed super-villains.

That said, even starting on the basis that I would not enjoy it, Ben Affleck's Batman seemed particularly bad:  body too chunky, personality too charmless, and the deep, rasping voice in costume particularly over the top.  

Looking at director Zack Snyder's body of work, I can safely say he has a sensibility that in no way appeals to me: dour; in the DC world, determined to treat Superman as a God/Jesus stand-in; and even cinematography that grates.

Anyway, I fell asleep just as the titular fight scene was set up, and I kept half waking for what seemed an eternity of loud noise and CGI fire and explosions.  My son got bored before it started too, and went off to have a shower.  I woke to see funeral scenes for Superman.  I don't think I missed anything that would change my mind that it was a dud movie:  which is pretty much my reaction to anything featuring Batman.

The second more positive review:

Solo - the first serious commercial flop of the Star Wars universe.

I thought it was OK story and acting wise, but there was clear room for improvement (with emphasis on the word "clear", as you will shortly understand).

It would seem everyone suspects the first directors were likely sacked for not treating the material reverentially enough.  But really, I think it could have benefited from more laughs.   It wasn't without humour, and I liked one big joke near the end in particular, but I still think a few more big laughs would have lightened it up more.

And speaking of light - what was going on with so much murky cinematography?   I know that home LCD TVs can have an issue with low light scenes at the best of times, but I see now that people who saw it at the cinema were posting about how they found it distractingly dim too.  Someone wrote an article about how digital projection in cinemas was not being checked enough, and that's why it looked so dark in so many cinemas.

So, it's not just me - lots of people hated the lighting, and I would guess that it alone accounted for a lot of poor word of mouth.  Who is this cinematographer Bradford Young?  Oh, he's a black, young-ish guy, and he doesn't seem to have done anything else I have seen except Arrival. I wasn't overly impressed with the looks of that movie either - but he clearly seems to like working with fog and mist.

Honestly, they shouldn't have sacked the directors - they should have sacked Young.

Having said that, in CGI terms, when they were bright enough, I thought a lot of the film looked pretty terrific.  But good CGI in certain sequences is not enough to bring in a crowd these days.   (It pretty much used to be - when they first started to be deployed in the late seventies.)

So, more or less worth seeing, and I'm sort of sorry that it seems to have killed the potential for a sequel in the Han Solo story.







Friday, October 05, 2018

The Guardian asks the hard, important question...

Why is the gay leather scene dying? 

(By the way, I've barely skimmed the article, which seems to go into considerable detail about "the scene" in London.)   

Wednesday, October 03, 2018

It was a lot of cannabis, but still..

While the cannabis industry becomes legal in the US, in South East Asia it is still taken very seriously:
The West Jakarta District Court handed down the death penalty on Tuesday to Rizky Albar, 29, and Rocky Siahaan, 37, for their roles in smuggling 1.3 tons of cannabis to Jakarta from Aceh in December last year.
The verdicts were read in two separate hearings.

"There are no mitigating factors. The defendant is sentenced to death," presiding judge Agus Setiawan said as he read Rizky's verdict, kompas.com reported.

The panel of judges found him guilty of being the right hand of an infamous drug dealer named Iwan — who currently remains at large — in the drug's smuggling scheme.