Over the last few months, I’ve been dabbling in using AI to generate or improve code. I have a subscription to GitHub Copilot and I’m finding it a really useful tool for increasing my productivity. Copilot comes in several different flavours, and I’ve been making particular use of a couple of them.

  • Copilot Autocomplete was the first Copilot tool that GitHub released. Once you’ve configured your editor to use it, the AI reads the file you’re working on and monitors what you’re typing. When it thinks it knows what you’re doing and what comes next, it displays a suggestion for the next chunk of code, and if you like what you see, you can just hit the tab key to accept it. I’ve been pleasantly surprised by how well it does. I’ve had cases where I’ve just typed the name of a method and it has autocompleted the code for me.
  • Copilot Chat was the next version to be released. This is a chat box that sits alongside your code where you can talk to the AI about what you’re doing and ask it for suggestions. This is great for taking on larger projects. I’ve found it particularly useful for working on front-end code. I can usually make CSS and JavaScript do what I want, but asking Copilot for suggestions makes me an order of magnitude quicker.

Those two tools alone make me a more efficient programmer. And they’re well worth the $10 a month I pay for my Copilot subscription. But recently I was invited to the preview of Copilot Workspace. And that’s a whole new level. Copilot Workspace takes a GitHub issue as its input and returns a complete, multifile pull request that implements the required change. I’ve been playing with it for small tweaks, but I decided the time was right to do something more substantial. I planned to write an entire Dancer app by defining issues and asking Copilot to implement the code. Here’s what happened. You can follow along at the GitHub repo.

I decided I would start from the standard, automatically generated Dancer2 app. So I ran dancer2 gen -a Example and committed the output from that. It was then time for the first issue. I decided to start by adding (empty) routes for user registration and login. I opened the issue in the Copilot Workspace and asked the AI for some suggested code. It didn’t really understand the idea of empty routes – but the pull request seemed pretty good. I merged the PR and moved on to the next issue – to add basic registration and login screens. Again, the pull request did a little more than I asked for – adding a bit more registration and login logic – but the code was good.
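For flavour, the sort of placeholder routes I had in mind would look something like this in Dancer2 (a minimal sketch – the module name and route bodies are illustrative, not the code from the PR):

package Example;
use Dancer2;

# Placeholder routes for user registration and login.
get '/register' => sub {
    template 'register';    # registration form to come
};

post '/register' => sub {
    # Registration logic to be filled in later.
    redirect '/';
};

get '/login' => sub {
    template 'login';       # login form to come
};

post '/login' => sub {
    # Login logic to be filled in later.
    redirect '/';
};

true;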

As an aside, you’ll notice that the PRs are all linked to the correct issues and contain substantial information about the changes. This is all generated by the AI.

For the next step, we needed a database table to store the users. I asked Copilot to use SQLite and it gave me what I wanted – once again, going above and beyond. For the first time, its overenthusiasm was slightly annoying, because it added some database code to store new users and I hadn’t told it that we would be using DBIx::Class. So that was the next issue and the next pull request. Note that the pull request even includes adding DBIx::Class to the prerequisites in Makefile.PL.
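For anyone unfamiliar with DBIx::Class, the kind of result class involved looks something like this (a sketch – the column layout is illustrative, not necessarily what Copilot generated):

package Example::Schema::Result::User;

use strict;
use warnings;

use base 'DBIx::Class::Core';

__PACKAGE__->table('users');
__PACKAGE__->add_columns(
    id       => { data_type => 'integer', is_auto_increment => 1 },
    username => { data_type => 'text' },
    email    => { data_type => 'text' },
    password => { data_type => 'text' },
);
__PACKAGE__->set_primary_key('id');
__PACKAGE__->add_unique_constraint(['username']);

1;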

Time for some unit tests (ok, maybe the best time was a few PRs ago!). The issue description was simple – “Write unit tests for everything we have so far”. Maybe it was too simple – as this was the first time the AI seemed to struggle a bit. I had been merging the PRs without really checking them, and this PR introduced a lot of useful tests – but many of them failed. Part of the problem here is that (as far as I can see) Copilot Workspace has no way to run the code it produces – so it was guessing how well it was doing. It took a few iterations to get that right – it basically boiled down to the database schema not being loaded into the database before the tests were run. At times while we were working through these problems, I was reminded of someone (I think it was Simon Willison) describing an AI programming assistant as “an overconfident, overenthusiastic intern”. Luckily, unlike an intern, Copilot never gets annoyed with you telling it to try again and providing more and more information to help it get to the bottom of a problem.
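The shape of the eventual fix was to deploy the schema into a fresh test database before the app handles any requests – something along these lines (a sketch, assuming a hypothetical Example::Schema class; a real test would also need the app configured to use the same database, for example via an environment variable):

use strict;
use warnings;

use Test::More;
use Plack::Test;
use HTTP::Request::Common;

use Example;
use Example::Schema;

# Create the tables before any test code touches the database.
my $schema = Example::Schema->connect('dbi:SQLite:dbname=t/test.db');
$schema->deploy;

my $test = Plack::Test->create( Example->to_app );

my $res = $test->request( GET '/' );
ok $res->is_success, 'Main page loads';

done_testing;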

After a while, we had a working test suite and were back on track.

So we were back at adding features to the application. I decided the next thing we needed was to display the logged-in user’s username and email address on the main page. That seemed simple enough and worked the first time. Around this time I was getting annoyed with the standard Dancer2 web page, so we removed most of that. Then I switched from Dancer’s default “simple” templating system to the Template Toolkit [issue / PR].

While we were tidying up the look and feel, we added login and logout buttons [issue / PR] and a register button on the logged out page [issue / PR]. This led to some more confusion for a while as logging out didn’t work. It turned out the AI had used outdated code to destroy the session and I had to get very specific before it would do the right thing [issue / PR].
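For the record, I believe the supported approach in current Dancer2 is to destroy the session via the app object, rather than the old Dancer1-style session->destroy – something like this sketch:

get '/logout' => sub {
    app->destroy_session;    # drops the whole session
    redirect '/';
};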

We then added some more tests [issue / PR], displayed registration and login errors [issue / PR] and ensured we were storing the passwords in hashed, rather than plain-text, form (to be honest, I’m slightly disappointed that the AI didn’t do that by default) [issue / PR].
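A typical Perl approach to password hashing (not necessarily the exact code from the PR) uses Crypt::Passphrase:

use strict;
use warnings;

use Crypt::Passphrase;

# Requires the Crypt::Passphrase::Argon2 backend to be installed.
my $authenticator = Crypt::Passphrase->new( encoder => 'Argon2' );

# On registration: store only the hash, never the password.
my $password = 'correct horse battery staple';
my $hash     = $authenticator->hash_password($password);

# On login: verify the supplied password against the stored hash.
if ( $authenticator->verify_password( $password, $hash ) ) {
    print "Password OK\n";
}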

At this point (and I don’t know why I didn’t do it sooner), we replaced the UI with something using Bootstrap [issue / PR]. That led to a bit more tweaking of the buttons [issue / PR].

At this point, I had basically got to where I wanted to be. I had an app that didn’t do anything useful, but let you register, log in and log out. And I’d done it all pretty quickly and without writing very much code.

Then I decided to push it too far.

The thing that I actually wanted to achieve at this point was to add social registration and login to the site. I created an issue – Allow users to register and login using a Google account – and Copilot gave me some code. But at this point, it’s not just about code. You also need to configure stuff at Google in order to get this working. And, while Copilot gave me some information about what I needed to do, I haven’t yet been able to get it working. This is a good example of the limitations of AI-powered programming. It’s great at generating code, but (so far, at least) not so good at keeping up to date with how to interface with external systems. Oh, and there’s the problem we saw earlier about it not actually running the tests.

So, how do I think the experiment went? I was impressed. There was a lot of code generated that was as good or better than I would have written myself. There are certainly the problems that I mentioned above, but this stuff is improving at such an incredible rate that I really can’t see those problems still existing in a year.

I’ve started using Copilot Workspace for a lot more of my projects. And I’m happy with the results I’ve got.

What about you? Have you used any version of Copilot to help with your coding? How successful has it been?

The post Dancing with Copilot Workspace appeared first on Perl Hacks.

We need programmers who like to play on the bleading edge. By trying out new features, they are able to report on problems that they find – and, in doing so, improve the experience for the many people who follow them.

I’m not usually much of a bleading edge programmer. But I’ve been enjoying Perl’s new object-oriented programming features, so I’ve been using them a lot. And, in the process, I’ve found a few issues that I’ve reported (or, in a couple of cases, will report) to the relevant people.

Often, the problems that the bleading edgers come across are problems with the feature itself. That’s not the case with me. I’ve been finding problems with how Perl’s infrastructure deals with the new feature.

And please note, it would be easy to interpret this blog post as me complaining about these tools being “broken” because they aren’t keeping up with the development of the language. That’s not the case at all. I realise that these infrastructure projects are all run by volunteers and I’m grateful for all that these people do – working for free, keeping these systems (systems that we often tend to take for granted) running. In cases where I think I would be at all useful I have, of course, offered my help in implementing these fixes.

So what are the problems?

The first CPAN module I wrote that used the new class syntax was Amazon::Sites. As soon as I uploaded it, I knew something was awry. I got an email from the PAUSE indexer saying that it couldn’t understand my distribution tarball. I wasn’t sure what the problem was, but within an hour I got a follow-up email from Neil Bowers pointing out that PAUSE couldn’t find a package statement in my module. That’s not surprising, as the new class syntax uses class as a replacement for package. And PAUSE hadn’t been updated to recognise that syntax. Before emailing me, Neil had taken the time to raise an issue in the PAUSE repo and he suggested that the upcoming Perl Toolchain Summit would be a good opportunity to fix the problem. He also suggested that adding a (strictly speaking, spurious) package line to the code would be a good workaround. I did that and uploaded a new version – which worked fine. And PAUSE was updated at the PTS. In the intervening time, I released a couple more modules that used the new syntax – so they also have the extra package line.
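In case you haven’t seen it, the workaround just means keeping a package statement next to the class declaration – something like this (the module name, field and method are illustrative, not the real Amazon::Sites code):

use v5.38;
use feature 'class';
no warnings 'experimental::class';

package My::Module;    # strictly spurious, but it gives the
                       # indexers a package statement to find

class My::Module;

field $name :param;

method greet {
    return "Hello, $name";
}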

The next problem is one that probably only affects me. Back in January, I wrote about some reusable GitHub Actions that I had developed for Perl code. Although it’s not mentioned in the blog post, I had added an action that uses Perl::Metrics::Simple to report on the complexity of my Perl code. I noticed that it was showing strange results for my modules that used the new syntax. Specifically, it wasn’t correctly reporting the complexity of code in methods. The reason is obvious, when you think about it. It’s just that Perl::Metrics::Simple doesn’t recognise the method keyword that is used in place of sub in the new OO syntax. I raised an issue in the repo for the module – optimistically promising a pull request in a few days. That didn’t happen as the problem is actually in PPI – which Perl::Metrics::Simple uses to parse the code. And there’s already a ticket to add all of the new keywords to PPI. Sadly, I don’t think my Perl is up to taking on this fix for the PPI team.

Given that the PAUSE issue I mentioned before has now been fixed, when I came to release App::LastStats recently I didn’t add the extra package line that had become my habit. It turns out that was a mistake. While my new module sailed past PAUSE, it seems that the lack of a package definition confuses MetaCPAN too. While my new module was being indexed by PAUSE and ending up in the 02Packages file correctly (so it was installable using tools like cpanm), it wasn’t appearing in MetaCPAN search or on my author page. Chatting with Olaf Alders on the #metacpan IRC channel, he spotted that the status of the release wasn’t being set to “latest” by the MetaCPAN ingestion code. Adding the same package line to the code soon fixed that problem too. Hopefully I’ll be able to work out where to fix the MetaCPAN code so it recognises class as a synonym for package. But, until that happens, anyone uploading a module to CPAN that uses the new syntax (is that really only me?) will need to add the package line.

There’s one more class of problem that I’m still trying to work out. And that’s down to my use of Feature::Compat::Class to make these modules compatible with versions of Perl that don’t support the new syntax. Part of the problem here is that we now have two versions of Perl that support the new syntax – 5.38 and 5.40. But they support slightly different versions of the syntax – that’s to be expected, of course; it reflects the way the new feature is being delivered incrementally.

The way that Feature::Compat::Class works is that it checks the version of Perl and, if it is running on a version less than 5.38, it loads another module called Object::Pad – which is a test bed for the new class syntax. Object::Pad supports more of the planned new syntax than has actually been released yet. So when Feature::Compat::Class loads Object::Pad, it passes a flag which tells Object::Pad to only allow the syntax that has been released in a Perl release. But which syntax? From which release? I guess it depends on which version of Object::Pad I’m using. Presumably, a version that was released after Perl 5.40 will support all of 5.40’s new syntax. And if I write code that uses the newest syntax, what happens when someone tries to run it on Perl 5.38? Currently, I’m only using 5.38’s syntax, so I’m not sure yet. And this is a problem that will get worse as future versions of Perl add more features to the class syntax.

I don’t think my new modules have many users – they’re very niche, so this is probably only a problem that I need to solve for myself. And I’m solving it by running the code in Docker containers that have the latest version of Perl installed. But it’s something I’ll need to think about more deeply if any of these modules become more widely used. Maybe I just encourage people to use them via the Docker images.

Oh, one final thing. The new class syntax is experimental. Some people would, I suppose, say that’s a good reason not to use it in a CPAN module – but, hey, bleading edge 🙂 But that means it produces loads of “experimental” warnings if you don’t explicitly add code to suppress them. That code is no warnings 'experimental::class'. But that doesn’t compile on a Perl earlier than 5.38 (because it’s not a recognised warning category on a version of Perl where the feature is unimplemented). So I need to look at using the if pragma to only turn off those warnings on the correct versions of Perl.
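The usual idiom for that looks something like this:

use strict;
use warnings;

# Only refer to the warning category on Perls that know about it;
# on older versions the category doesn't exist and naming it is a
# compile-time error.
no if $] >= 5.038, warnings => 'experimental::class';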

I don’t want to put anyone off using the new class syntax. I think it’s a great new tool and I’m looking forward to seeing it become more powerful as each new version of Perl is released. I just want people to realise that you will hit certain speedbumps by being an early adopter of features like this.

Have you tried the new syntax? What do you think of it?

The post On the [b]leading edge appeared first on Perl Hacks.

Back in May, I wrote a blog post about how I had moved a number of Dancer2 applications to a new server and had, in the process, created a standardised procedure for deploying Dancer2 apps. It’s been about six weeks since I did that and I thought it would be useful to give a little update on how it all went and talk about a few little changes I’ve made.

I mentioned that I was moving the apps to a new server. What I didn’t say was that I was convinced my old server was overpowered (and overpriced!) for what I needed, so the new server has less RAM and, I think, a slower CPU than the old one. And that turned out to be a bit of a problem. It turned out there was a time early each morning when there were too many requests coming into the server and it ran out of memory. I was waking up most days to a dead server. My previous work meant that fixing it wasn’t hard, but it really wasn’t something that I wanted to do most mornings.

So I wanted to look into reducing the amount of memory used by the apps. And that turned out to be a two-stage approach.

You might recall that the apps were all controlled using a standardised driver program called “app_service”. It looked like this:

#!/usr/bin/env perl
 
use warnings;
use strict;
use Daemon::Control;
 
use ENV::Util -load_dotenv;
 
use Cwd qw(abs_path);
use File::Basename;
 
Daemon::Control->new({
  name      => ucfirst lc $ENV{KLORTHO_APP_NAME},
  lsb_start => '$syslog $remote_fs',
  lsb_stop  => '$syslog',
  lsb_sdesc => 'Advice from Klortho',
  lsb_desc  => 'Klortho knows programming. Listen to Klortho',
  path      => abs_path($0),
 
  program      => '/usr/bin/starman',
  program_args => [ '--workers', 10, '-l', ":$ENV{KLORTHO_APP_PORT}",
                    dirname(abs_path($0)) . '/app.psgi' ],
 
  user  => $ENV{KLORTHO_OWNER},
  group => $ENV{KLORTHO_GROUP},
 
  pid_file    => "/var/run/$ENV{KLORTHO_APP_NAME}.pid",
  stderr_file => "$ENV{KLORTHO_LOG_DIR}/error.log",
  stdout_file => "$ENV{KLORTHO_LOG_DIR}/output.log",
 
  fork => 2,
})->run;

We’re deferring most of the clever stuff to Daemon::Control. But we’re building the parameters to pass to the constructor. And two of the parameters (“program” and “program_args”) control how the service is run. You’ll see I’m using Starman. The first fix was obvious when you look at my code. Starman is a pre-forking server and we always start with 10 copies of the app. Now, I’m very proud of some of my apps, but I think it’s optimistic to think my Klortho server will ever need to respond to 10 simultaneous requests. Honestly, I’m pleasantly surprised if it gets 10 requests in a month. So the first change was to make it easy to change the number of workers.

In the previous article, I talked about using ENV::Util to load environment variables from a “.env” file. And we can continue to use that approach here. I rewrote the “program_args” code to be this:

program_args => [ '--workers', ($ENV{KLORTHO_APP_WORKERS} // 10), '-l', ":$ENV{KLORTHO_APP_PORT}",
                  dirname(abs_path($0)) . '/app.psgi' ],

Here we look for an environment variable (defined in “.env”) and use either that value or a default of 10.
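For reference, the relevant part of a “.env” file might look like this (values are illustrative):

KLORTHO_APP_NAME=klortho
KLORTHO_APP_PORT=8080
KLORTHO_APP_WORKERS=2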

I made similar changes to all the “app_service” files, added appropriate environment variables to all the “.env” files and restarted all the apps. Immediately, I could see an improvement as I was now running maybe a third of the app processes on the server. But I felt I could do better. So I had a close look at the Starman documentation to see if there was anything else I could tweak. That’s when I found the “--preload-app” command-line option.

Starman works by loading a main driver process which then fires up as many worker processes as you ask it for. Without the “--preload-app” option, each of those worker processes loads its own copy of the application. With the option, the driver process loads the application once, before forking, and the workers share its memory pages – a worker only gets a private copy of a page when it writes to it (copy-on-write). This can be a big memory saving – although it’s important to note that the documentation warns:

Enabling this option can cause bad things happen when resources like sockets or database connections are opened at load time by the master process and shared by multiple children.

I’m pretty sure that most of my apps are not in any danger here, but I’m keeping a close eye on the situation and if I see any problems, it’s easy enough to turn preloading off again.

When adding the preloading option to “app_service”, I realised I should probably completely rewrite the code that builds the program arguments. It now looks like this:

my @program_args;
if ($ENV{KLORTHO_WORKER_COUNT}) {
  push @program_args, '--workers', $ENV{KLORTHO_WORKER_COUNT};
}
if ($ENV{KLORTHO_APP_PORT}) {
  push @program_args, '-l', ":$ENV{KLORTHO_APP_PORT}";
}
if ($ENV{KLORTHO_APP_PRELOAD}) {
  push @program_args, '--preload-app';
}
push @program_args, dirname(abs_path($0)) . '/bin/app.psgi';

The observant among you will notice that I’ve subtly changed the behaviour of the worker count environment variable. Previously, a missing variable would use a default value of 10. Now, it just omits the argument which uses Starman’s default value of 5.

I’ve made similar changes in all my “app_service” programs and set environment variables to turn preloading on. And now my apps use substantially less memory. The server hasn’t died since I implemented this stuff at the start of this week. So that makes me very happy.

But programming is the pursuit of minimisation. I’ve already seen two places where I can make these programs smaller and simpler.

  1. That last code snippet looks too repetitive. It should be a loop iterating over a hash. The keys are the names of the environment variables and the values are references to arrays containing the values that are added to the program arguments if that environment variable is set (see the sketch after this list).
  2. I now have five or six “app_service” programs that look very similar. I must be able to turn them into one standard program. Do those environment variables really need to include the application name?
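Here’s one way point 1 might look (a sketch – I’ve used subs rather than plain array references, since two of the options need the environment variable’s value interpolated into the arguments; it assumes the same imports as app_service):

my %option_for = (
    KLORTHO_WORKER_COUNT => sub { ( '--workers', $_[0] ) },
    KLORTHO_APP_PORT     => sub { ( '-l', ":$_[0]" ) },
    KLORTHO_APP_PRELOAD  => sub { ( '--preload-app' ) },
);

my @program_args;
for my $var ( sort keys %option_for ) {
    push @program_args, $option_for{$var}->( $ENV{$var} ) if $ENV{$var};
}
push @program_args, dirname( abs_path($0) ) . '/bin/app.psgi';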

The Klortho service driver program is on GitHub. Can you suggest any more improvements?

The post Deploying Dancer Apps (Addendum) appeared first on Perl Hacks.

One of the most popular posts I’ve written in recent months was the one where I talked about all the pointless personal projects I have. The consensus in the many comments I received was that anything you find useful isn’t pointless. And I can’t really argue with that.

But it’s nice when one of your projects is used by other people. And that happened to me recently.

The initial commit in mergecal is from 2016, but I strongly suspect it existed as code that wasn’t in source code control for several years before that. The idea behind it is simple enough. I wanted to be able to share my calendar with someone, but I didn’t have a single iCal file that I could share. For various complicated and yet dull reasons, my calendar is split across a number of separate iCal files. Initially, I remember thinking there must be an online service that will take a list of iCal calendars and produce a single, combined one. But a few hours on Google didn’t find anything so I did what any hacker would do and wrote my own.

It really wasn’t difficult. As usual, it was just a case of plumbing together a few CPAN modules. In this case, Text::vFile::asData did most of the heavy lifting – with JSON used to parse a configuration file. It can’t have taken more than an hour to write. And, as the commit history shows, very few subsequent changes were required. I just set it up with the correct configuration and a cronjob that rebuilt the combined calendar once a day and published it on my web site.
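The core of the approach is easy to sketch (this is illustrative, not the actual mergecal code – it ignores property parameters and iCal line folding, and assumes a hypothetical JSON config listing the source files):

#!/usr/bin/env perl

use strict;
use warnings;

use JSON 'decode_json';
use Text::vFile::asData;

# Hypothetical config: { "calendars": [ "work.ics", "home.ics" ] }
my $config = do {
    open my $fh, '<', 'mergecal.json' or die "mergecal.json: $!";
    local $/;
    decode_json(<$fh>);
};

my $parser = Text::vFile::asData->new;
my @events;

for my $file ( @{ $config->{calendars} } ) {
    open my $fh, '<', $file or die "$file: $!";
    my $data = $parser->parse($fh);
    # Collect the VEVENTs from each parsed VCALENDAR.
    push @events, grep { $_->{type} eq 'VEVENT' }
        @{ $data->{objects}[0]{objects} };
}

# Naive output: real iCal needs line folding and property parameters.
print "BEGIN:VCALENDAR\r\nVERSION:2.0\r\nPRODID:-//mergecal//EN\r\n";
for my $event (@events) {
    print "BEGIN:VEVENT\r\n";
    for my $name ( sort keys %{ $event->{properties} } ) {
        print "$name:$_->{value}\r\n"
            for @{ $event->{properties}{$name} };
    }
    print "END:VEVENT\r\n";
}
print "END:VCALENDAR\r\n";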

And then I forgot about it for years. The best kind of software.

Then, in January of this year, I got a pull request against the code. This astonished me. MY SOFTWARE HAD A USER. And in the PR, the user said “It boggles my mind that there is still no simpler free solution, even after all those years”.

So maybe this would be useful to a few more people. Perhaps I should market it better (where “better” means “at all”).

As a first step towards that, I’ve rewritten it and released it to CPAN as App::MergeCal. Maybe I should think about putting it online as some kind of web service.

Anyway, it makes me incredibly happy to know my software is used by even one person. Which reminds me – please take the time to say “thank you” to anyone whose software you find useful. It’s a small thing, but you’ll make someone very happy.

The post Combining calendars appeared first on Perl Hacks.

Data Munging with Perl was published in February 2001. That was over 23 years ago. It’s even 10 years since Manning took the book out of print and the rights to the content reverted to me. Over that time, I’ve been to a lot of Perl conferences and met a lot of people who have bought and read the book. Many of them have been kind enough to say nice things about how useful they have found it. And many of those readers have followed up by asking if there would ever be a second edition.

My answer has always been the same. It’s a lot of effort to publish a book. The Perl book market (over the last ten years, at least) is pretty much dead. So I really didn’t think the amount of time I would need to invest in updating the book would be worth it for the number of sales I would get.

But times change.

You may have heard of Perl School. It’s a small publishing brand that I’ve been using to publish Perl ebooks for a few years. You may have even read the interview that brian d foy did with me for perl.com a few years ago about Perl School and the future of Perl publishing. In it, I talk a lot about how much easier (and, therefore, cheaper) it is to publish books when you’re just publishing ebook versions. I end the interview by inviting anyone to come to me with proposals for Perl School books, but brian is one of only two people who have ever taken me up on that invitation.

In fact, I haven’t really written enough Perl School books myself. There are only two – Perl Taster and The Best of Perl Hacks.

A month or so ago, brian was passing through London and we caught up over dinner. Of course, Perl books was one of the things we discussed and brian asked if I was ever going to write a second edition of Data Munging with Perl. I was about to launch into my standard denial when he reminded me that I had already extracted the text from the book into a series of Markdown files which would be an excellent place to start from. He also pointed out that most of the text was still relevant – it was just the Perl that would need to be updated.

I thought about that conversation over the next week or so and I’ve come to the conclusion that he was right. It’s actually not going to be that difficult to get a new edition out.

I think he was a little wrong though. I think there are a few more areas that need some work to bring the book up to date.

  • Perl itself has changed a lot since 2001. Version 5.6.0 was released while I was writing the book – so I was mostly targeting 5.005 (that was the point at which the Perl version numbering scheme changed). I was using “-w” and bareword filehandles. It would be great to have a version that uses “use warnings” and lexical filehandles (see the sketch after this list). There are dozens of other new Perl features that have been introduced in the last twenty years.
  • There are many new and better CPAN modules. I feel slightly embarrassed that the current edition contains examples that use Date::Manip and Date::Calc. I’d love to replace those with DateTime and Time::Piece. Similarly, I’d like to expand the section on DBI, so it also covers DBIx::Class. There’s a lot of room for improvement in this area.
  • And then there’s the way that the world of computing has changed. The current edition talks about HTTP “becoming ubiquitous” – which was an accurate prediction, but rather dates the book. There are discussions on things like FTP and NFS – stuff I haven’t used for years. And there are new things that the book doesn’t cover at all – file formats like YAML and JSON, for example.
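As a tiny illustration of the first point, here’s the kind of modernisation I mean (a sketch, not an excerpt from the book):

# Then (Perl 5.005 era): the -w switch and a bareword filehandle.
#
#   #!/usr/bin/perl -w
#   open LOG, ">>process.log" or die "Can't open log: $!";
#   print LOG "Processed $count records\n";
#   close LOG;

# Now: pragmas, a lexical filehandle and three-argument open.
use strict;
use warnings;

my $count = 42;    # illustrative

open my $log_fh, '>>', 'process.log'
    or die "Can't open log: $!";
print {$log_fh} "Processed $count records\n";
close $log_fh;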

The more I thought about it, the more I realised that I’d really like to see this book. I think the current version is still useful and contains good advice. But I don’t want to share it with many people because I worry that they would pick up an out-of-date idea of what constitutes best practices in Perl programming.

So that has now become my plan. Over the next couple of months, I’ll be digging through the existing book and changing it into something that I’m still proud to see people reading. I don’t want to predict when it will be ready, but I’d hope to have it released in the autumn.

I’d be interested to hear what you think about this plan. Have you read the book? Are there parts of it that you would like to see updated? What new syntax should I use? What new CPAN modules are essential?

Let me know what you think.

The post Bowing to the inevitable appeared first on Perl Hacks.

Royal Titles Decoded: What Makes a Prince or Princess? — Line of Succession Blog

Letters Patent issued by George V in 1917

Royal titles in the United Kingdom carry a rich tapestry of history, embodying centuries of tradition while adapting to the changing landscape of the modern world. This article delves into the structure of these titles, focusing on significant changes made during the 20th and 21st centuries, and how these rules affect current royals.

The Foundations: Letters Patent of 1917

The framework for today’s royal titles was significantly shaped by the Letters Patent issued by King George V in 1917. This document was pivotal in redefining who in the royal family would be styled with “His or Her Royal Highness” (HRH) and as a prince or princess. Specifically, the 1917 Letters Patent restricted these styles to:

  • The sons and daughters of a sovereign.
  • The male-line grandchildren of a sovereign.
  • The eldest living son of the eldest son of the Prince of Wales.

This move was partly in response to the anti-German sentiment of World War I, aiming to streamline the monarchy and solidify its British identity by reducing the number of royals with German titles.

Notice that the definitions talk about “a sovereign”, not “the sovereign”. This means that when the sovereign changes, no-one will lose their royal title (for example, Prince Andrew is still the son of a sovereign, even though he is no longer the son of the sovereign). However, people can gain royal titles when the sovereign changes — we will see examples below.

Extension by George VI in 1948

Understanding the implications of the existing rules as his family grew, King George VI issued a new Letters Patent in 1948 to extend the style of HRH and prince/princess to the children of the future queen, Princess Elizabeth (later Queen Elizabeth II). This was crucial as, without this adjustment, Princess Elizabeth’s children would not automatically have become princes or princesses because they were not male-line grandchildren of the monarch. This ensured that Charles and Anne were born with princely status, despite being the female-line grandchildren of a monarch.

The Modern Adjustments: Queen Elizabeth II’s 2012 Update

Queen Elizabeth II’s update to the royal titles in 2012 before the birth of Prince William’s children was another significant modification. The Letters Patent of 2012 decreed that all the children of the eldest son of the Prince of Wales would hold the title of HRH and be styled as prince or princess, not just the eldest son. This move was in anticipation of changes brought about by the Succession to the Crown Act of 2013, which ended the system of male primogeniture, ensuring that the firstborn child of the Prince of Wales, regardless of gender, would be the direct heir to the throne. Without this change, there could have been a situation where Prince William’s first child (and the heir to the throne) was a daughter who wasn’t a princess, whereas her eldest (but younger) brother would have been a prince.

Impact on Current Royals

  • Children of Princess Anne: When Anne married Captain Mark Phillips in 1973, he was offered an earldom but declined it. Consequently, their children, Peter Phillips and Zara Tindall, were not born with any titles. This decision reflects Princess Anne’s preference for her children to have a more private life, albeit still active within the royal fold.
  • Children of Prince Edward: Initially, Prince Edward’s children were styled as the children of an earl, despite his being a son of the sovereign. Recently, his son James assumed the courtesy title Earl of Wessex, after Prince Edward was created Duke of Edinburgh, the title previously held by Prince Philip. His daughter, Lady Louise Windsor, was also styled in line with Edward’s wish for a lower-profile royal status for his children.
  • Children of Prince Harry: When Archie and Lilibet were born, they were not entitled to princely status or HRH. They were great-grandchildren of the monarch and, because the Queen’s 2012 adjustment applied only to the children of the eldest son of the Prince of Wales, their cousins — George, Charlotte and Louis — were the only great-grandchildren of the monarch with those titles. When their grandfather became king, they became male-line grandchildren of a monarch and, hence, a prince and a princess. It took a while for those changes to be reflected on the royal family website. This presumably gave the royal household time to reflect on the effect of the children’s parents withdrawing from royal life and moving to the USA.

Special Titles: Prince of Wales and Princess Royal

  • Prince of Wales: Historically granted to the heir apparent, this title is not automatic and needs to be specifically bestowed by the monarch. Prince Charles was created Prince of Wales in 1958, though he had been the heir apparent since 1952. Prince William, on the other hand, received the title in 2022 — just a day after the death of Queen Elizabeth II.
  • Princess Royal: This title is reserved for the sovereign’s eldest daughter but is not automatically reassigned when the previous holder passes away or when a new eldest daughter is born. Queen Elizabeth II was never Princess Royal because her aunt, Princess Mary, held the title during her lifetime. Princess Anne currently holds this title, having received it in 1987.

The Fade of Titles: Distant Royals

As the royal family branches out, descendants become more distant from the throne and lose their entitlement to HRH and princely status. For example, the Duke of Gloucester, the Duke of Kent, Prince Michael of Kent and Princess Alexandra all have princely status as male-line grandchildren of George V. Their children are all great-grandchildren of a monarch and, therefore, do not have royal styles or titles. This reflects a natural trimming of the royal family tree, focusing the monarchy’s public role on those directly in line for succession.

Conclusion

The evolution of British royal titles reflects both adherence to deep-rooted traditions and responsiveness to modern expectations. These titles not only delineate the structure and hierarchy within the royal family but also adapt to changes in societal norms and the legal landscape, ensuring the British monarchy remains both respected and relevant in the contemporary era.

Originally published at https://blog.lineofsuccession.co.uk on April 25, 2024.


Royal Titles Decoded: What Makes a Prince or Princess? — Line of Succession Blog was originally published in Line of Succession on Medium, where people are continuing the conversation by highlighting and responding to this story.

The view of the planet [AI-generated image]

Changing rooms are the same all over the galaxy and this one really played to the stereotype. The lights flickered that little bit more than you’d want them to, a sizeable proportion of the lockers wouldn’t lock and the whole room needed a good clean. It didn’t fit with the eye-watering amount of money we had all paid for the tour.

There were a dozen or so of us changing from our normal clothes into outfits that had been supplied by the tour company — outfits that were supposed to render us invisible when we reached our destination. Not invisible in the “bending light rays around you” way, they would just make us look enough like the local inhabitants that no-one would give us a second glance.

Appropriate changing room etiquette was followed. Everyone was either looking at the floor or into their locker to avoid eye contact with anyone else. People talked in lowered voices to people they had come with. People who, like me, had come alone were silent. I picked up on some of the quiet conversations — they were about the unusual flora and fauna of our location and the unique event we were here to see.

Soon, we had all changed and were ushered into a briefing room where our guide told us many things we already knew. She had slides explaining the physics behind the phenomenon and was at great pains to emphasise the uniqueness of the event. No other planet in the galaxy had been found that met all of the conditions for what we were going to see. She went through the history of tourism to this planet — decades of uncontrolled visits followed by the licensing of a small number of carefully vetted companies like the one we were travelling with.

She then turned to more practical matters. She reiterated that our outfits would allow us to pass for locals, but that we should do all we could to avoid any interactions with the natives. She also reminded us that we should only look at the event through the equipment that we would be issued with on our way down to the planet.

Through a window in the briefing room a planet, our destination, hung in space. Beyond the planet, its star could also be seen.

An hour or so later, we were on the surface of the planet. We were deposited at the top of a grassy hill on the edge of a large crowd of the planet’s inhabitants. Most of us were of the same basic body shape as the quadruped locals and, at first glance at least, passed for them. A few of us were less lucky and had to stay in the vehicles to avoid suspicion.

The timing of the event was well understood and the company had dropped us off early enough that we were able to find a good viewing spot but late enough that we didn’t have long to wait. We had been milling around for half an hour or so when a palpable moment of excitement passed through the crowd and everyone looked to the sky.

Holding the equipment I had been given to my eyes I could see what everyone else had noticed. A small bite seemed to have been taken from the bottom left of the planet’s sun. As we watched, the bite got larger and larger as the planet’s satellite moved in front of the star. The satellite appeared to be a perfect circle, but at the last minute — just before it covered the star completely — it became obvious that the edge wasn’t smooth as gaps between irregularities on the surface (mountains, I suppose) allowed just a few points of light through.

And then the satellite covered the sun and the atmosphere changed completely. The world turned dark and all conversations stopped. All of the local animals went silent. It was magical.

My mind went back to the slides explaining the phenomenon. Obviously, the planet’s satellite and star weren’t the same size, but their distance from the planet exactly balanced their difference in size so they appeared the same size in the sky. And the complex interplay of orbits meant that on rare occasions like this, the satellite would completely and exactly cover the star.

That was what we were there for. This was what was unique about this planet. No other planet in the galaxy had a star and a satellite that appeared exactly the same size in the sky. This was what made the planet the most popular tourist spot in the galaxy.

Ten minutes later, it was over. The satellite continued on its path and the star was gradually uncovered. Our guide bundled us into the transport and back up to our spaceship.

Before leaving the vicinity of the planet, our pilot found three locations in space where the satellite and the star lined up in the same way and created fake eclipses for those of us who had missed taking photos of the real one.

Originally published at https://blog.dave.org.uk on April 7, 2024.

The post The Tourist appeared first on Davblog.

I really thought that 2023 would be the year I got back into the swing of seeing gigs. But, somehow, I ended up seeing even fewer than I did in 2022: just 12, compared with the 16 I saw the previous year. Sometimes, I look at Martin’s monthly gig round-ups and wonder what I’m doing with my life!

I normally list my ten favourite gigs of the year, but it would be rude to miss just two gigs from the list, so here are all twelve gigs I saw this year — in, as always, chronological order.

  • John Grant (supported by The Faultress) at St. James’s Church
    John Grant has become one of those artists I try to see whenever they pass through London. And this was a particularly special night as he was playing an acoustic set in one of the most atmospheric venues in London. The evening was only slightly marred by the fact I arrived too late to get a decent seat and ended up not being able to see anything.
  • Hannah Peel at Kings Place
    Hannah Peel was the artist in residence at Kings Place for a few months during the year and played three gigs during that time. This was the first of them — where she played her recent album, Fir Wave, in its entirety. A very laid-back and thoroughly enjoyable evening.
  • Orbital at the Eventim Apollo
    I’ve been meaning to get around to seeing Orbital for many years. This show was originally planned to be at the Brixton Academy but as that venue is currently closed, it was relocated to Hammersmith. To be honest, this evening was slightly hampered by the fact I don’t know as much of their work as I thought I did and it was all a bit samey. I ended up leaving before the encore.
  • Duran Duran (supported by Jake Shears) at the O2 Arena
    Continuing my quest to see all of the bands I was listening to in the 80s (and, simultaneously, ticking off the one visit to the O2 that I allow myself each year). I really enjoyed the nostalgia of seeing Duran Duran but, to be honest, I think I enjoyed Jake Shears more — and it was the Scissor Sisters I was listening to on the way home.
  • Hannah Peel and Beibei Wang at Kings Place
    Even in a year where I only see a few gigs, I still manage to see artists more than once. This was the second of Hannah Peel’s artist-in-residence shows. She appeared with Chinese percussionist Beibei Wang in a performance that was completely spontaneous and unrehearsed. Honestly, some parts were more successful than others, but it was certainly an interesting experience.
  • Songs from Summerisle at the Barbican Hall
    The Wicker Man is one of my favourite films, so I jumped at the chance to see the songs from the soundtrack performed live. But unfortunately, the evening was a massive disappointment. The band sounded like they had met just before the show and, while they all obviously knew the songs, they hadn’t rehearsed them together. Maybe they were going for a rustic feel — but, to me, it just sounded unprofessional.
  • Belle and Sebastian at the Roundhouse
    Another act that I try to see as often as possible. I know some people see Belle and Sebastian as the most Guardian-reader band ever — but I love them. This show saw them on top form.
  • Jon Anderson and the Paul Green Rock Academy at the Shepherds Bush Empire
    I’ve seen Yes play live a few times in the last ten years or so and, to be honest, it can sometimes be a bit over-serious and dull. In this show, Jon Anderson sang a load of old Yes songs with a group of teenagers from the Paul Green Rock Academy (the school that School of Rock was based on) and honestly, the teenagers brought such a feeling of fun to the occasion that it was probably the best Yes-related show that I’ve seen.
  • John Grant and Richard Hawley at the Barbican Hall
    Another repeated act — my second time seeing John Grant in a year. This was something different as he was playing a selection of Patsy Cline songs. I don’t listen to Patsy Cline much, but I knew a few more of the songs than I expected to. This was a bit lower-key than I was expecting.
  • Peter Hook and the Light at the Eventim Apollo
    I’ve been planning to see Peter Hook and the Light for a couple of years. There was a show I had tickets for in 2020, but it was postponed because of COVID and when it was rescheduled, I was unable to go, so I cancelled my ticket and got a refund. So I was pleased to get another chance. And this show had them playing both of the Substance albums (Joy Division and New Order). I know New Order still play some Joy Division songs in their sets, but this is probably the best chance I’ll have to see some deep Joy Division cuts played live. I really enjoyed this show.
  • Heaven 17 at the Shepherds Bush Empire
    It seems I see Heaven 17 live most years and they usually appear on my “best of” lists. This show was celebrating the fortieth anniversary of their album The Luxury Gap — so that got played in full, alongside many other Heaven 17 and Human League songs. A thoroughly enjoyable night.
  • The Imagined Village and Afro-Celt Sound System at the Roundhouse
    I’ve seen both The Imagined Village and the Afro-Celts live once before. And they were two of the best gigs I’ve ever seen. I pretty much assumed that the death of Simon Emmerson (who was an integral part of both bands) earlier in 2023 would mean that both bands would stop performing. But this show was a tribute to Emmerson and the bands both reformed to celebrate his work. This was probably my favourite gig of the year. That’s The Imagined Village (featuring two Carthys, dour Coppers and Billy Bragg) in the photo at the top of this post.

So, what’s going to happen in 2024? I wonder if I’ll get back into the habit of going to more shows. I only have a ticket for one gig next year — They Might Be Giants playing Flood in November (a show that was postponed from this year). I guess we’ll see. Tune in this time next year to see what happened.

Originally published at https://blog.dave.org.uk on December 31, 2023.

The post 2023 in Gigs appeared first on Davblog.

Seventy Years of Change

Her Majesty has, of course, seen changes in many areas of society in the seventy years of her reign. But here, we’re most interested in the line of succession. So we thought it would be interesting to look at the line of succession on the day that she took the throne and see what had happened to the people who were at the top of the line of succession on that day. It’s a very different list to today’s.

  1. The Prince Charles, Duke of Cornwall
    We start with the one person who is in exactly the same place as he was seventy years ago. Prince Charles was three years old and hadn’t yet been made Prince of Wales.
  2. The Princess Anne
    Princess Anne has fallen a long way in seventy years. The birth of younger brothers (back in the days when sex mattered in the line of succession) and those brothers having families of their own mean that she is now down at number 17.
  3. Princess Margaret
    We’ve run out of the Queen’s descendants after only two places (today, they fill the top 24 places in the line) so we move to her sister. Princess Margaret had fallen to 11th place before her death in 2002.
  4. Prince Henry, Duke of Gloucester
    We’ve now run out of descendants of George VI, so we need to look at his brothers. This is the father of the current duke. He fell to 8th place before dying in 1974.
  5. Prince William of Gloucester
    The Duke of Gloucester’s eldest son had fallen to position 9 before sadly dying before his father in 1972.
  6. Prince Richard of Gloucester
    As his eldest son predeceased their father, it was Prince Richard who became Duke of Gloucester when the first duke died in 1974. He is currently in 30th place.
  7. Prince Edward, Duke of Kent
    The first Duke of Kent had died ten years earlier, so it was his son, Prince Edward, then aged 16, who held the title in 1952. He fell out of the top 30 in 2012.
  8. Prince Michael of Kent
    Prince Michael had fallen to 16th place before his marriage to a Catholic, in 1978, excluded him from the line of succession. He was reinstated in 2015 (because the Succession to the Crown Act meant that marriage to a Catholic was no longer a reason for exclusion) but he reappeared outside of the top 30.
  9. Princess Alexandra of Kent
    Princess Alexandra had dropped down the list pretty consistently throughout her life. From 1999, she popped in and out of the top 30 a few times, but she left it for the last time in 2003.
  10. Princess Mary, Princess Royal
    The only daughter of George V, Princess Mary had fallen to 17th in line before she died in 1965.
  11. George Lascelles, The 7th Earl of Harewood
    Fell out of the top 30 in 1994 before dying in 2011.
  12. David Lascelles, Viscount Lascelles
    Fell out of the top 30 in 1993.
  13. Gerald Lascelles
    Fell out of the top 30 in 1982 and died in 1998.
  14. Princess Arthur of Connaught, Duchess of Fife
    Fell to 17th before dying in 1959.
  15. James Carnegie, 3rd Duke of Fife
    Fell out of the top 30 in 1981 and died in 2015.
  16. Olav V, King of Norway
    A bit of a leap as we find the royal family of Norway surprisingly close to the top of the list. King Olav was a grandson of Edward VII (through Edward’s daughter Maud). He fell out of the top 30 in 1979 and died in 1991.
  17. Prince Harald of Norway
    Prince Harald became king of Norway in 1991. He fell out of the top 30 of the British line of succession in 1977.
  18. Princess Ragnhild of Norway
    Princess Ragnhild fell out of the top 30 in 1973 and died in 2012.
  19. Princess Astrid of Norway
    Princess Astrid fell out of the top 30 in 1964.
  20. Carol II of Romania
    The next-closest royal family to ours is the Romanians. Carol II was a great-grandson of Victoria. The death of George VI moved him up a place from 21 to 20 and he remained there until his death the following year. Carol hadn’t actually been King of Romania since he was forced to abdicate in 1940.
  21. Carol Lambrino
    The question of Carol Lambrino’s legitimacy is a matter of some dispute — so he may not have been in the line of succession at all. But, if he was, he fell out of the top 30 in 1963 and died in 2006.
  22. Paul-Philippe Hohenzollern
    As the son of the possibly-illegitimate Carol Lambrino, Paul-Philippe’s place in the line of succession is also in question. But, anyway, he fell out of the top 30 in 1962.
  23. Prince Nicholas of Romania
    Prince Nicholas fell out of the top 30 in 1961 and died in 1978.
  24. Elisabeth of Romania
    Fell to number 27 before dying in 1956.
  25. Maria of Yugoslavia
    Fell to position 30 before dying in 1961.
  26. Peter II of Yugoslavia
    Peter was no longer King of Yugoslavia, having been deposed in 1945. He fell out of the top 30 in 1961 and died in 1970.
  27. Prince Tomislav of Yugoslavia
    Fell out of the top 30 in 1960 and died in 2000.
  28. Prince Andrew of Yugoslavia
    Fell out of the top 30 in 1959 and died in 1990.
  29. Princess Ileana of Romania
    Fell out of the top 30 in 1954 and died in 1991.
  30. Archduke Stefan of Austria
    Fell out of the top 30 in 1953 and died in 1998.

I think that’s an interesting list for a few reasons:

  • The fact that we’ve gone from two of the Queen’s descendants to twenty-four of them on the list (but even that’s not as big a difference as happened during Victoria’s reign).
  • Only ten of the people on the list are still living.
  • There’s a large number of foreign royalty on the list — basically, the second half of the list is taken up by members of the royal families of Norway, Romania and Yugoslavia. This is obviously because of the way that royal families inter-married up until early in the 20th century. We see far less of that now.

So what do you think? Was the 1952 list a surprise to you? Did you expect it to be as different as it is from the current list?

Originally published at https://blog.lineofsuccession.co.uk on February 7, 2022.


Seventy Years of Change — Line of Succession Blog was originally published in Line of Succession on Medium, where people are continuing the conversation by highlighting and responding to this story.

Yesterday’s coronation showed Britain doing what Britain does best — putting on the most gloriously bonkers ceremony the world has seen…

Ratio: The Simple Codes Behind the Craft of Everyday Cooking (1) (Ruhlman's Ratios)
author: Michael Ruhlman
name: David
average rating: 4.06
book published: 2009
rating: 0
read at:
date added: 2023/02/06
shelves: currently-reading
review:

Rather later than usual (again!) here is my review of the best ten gigs I saw in 2022. For the first time since 2019, I did actually see more than ten gigs in 2022 although my total of sixteen falls well short of my pre-pandemic years.

Here are my ten favourite gigs of the year. As always, they’re in chronological order.

  • Pale Waves at the Roundhouse
    I’ve seen Pale Waves a few times now and I think they’ve firmly established their place on my “see them whenever they tour near me” list. This show was every bit as good as I’ve ever seen them.
  • Orchestral Manoeuvres in the Dark at the Royal Albert Hall
    Another band I see whenever I can. This was a slightly different set where the first half was called “Atmospheric” and concentrated on some deeper cuts from their back catalogue and the second half included all the hits.
  • Chvrches at Brixton Academy
    In 2020, I moved to a flat that’s about fifteen minutes’ walk from Brixton Academy. But I had to wait about eighteen months in order to take advantage of that fact. The last couple of times I’ve seen Chvrches were at Alexandra Palace, so it was nice to see them at a smaller venue again. This show featured a not-entirely unexpected guest appearance from Robert Smith.
  • Sunflower Bean at Electric Ballroom
    Another act who I see live as often as I can. And this was a great venue to see them in.
  • Pet Shop Boys at the O2 Arena
    There’s always one show a year that draws me to the soulless barn that is the O2 Arena. Every time I go there, I vow it’ll be the last time – but something always pulls me back. This year it was the chance to see a band I loved in the 80s and have never seen live. This was a fabulous greatest hits show that had been postponed from 2020.
  • Lorde at the Roundhouse
    A new Lorde album means another Lorde tour. And, like Chvrches, she swapped the huge expanse of Alexandra Palace for multiple nights at a smaller venue. This was a very theatrical show that matched the vibe of the Solar Power album really well.
  • LCD Soundsystem at Brixton Academy
    Another show at Brixton Academy. For some reason, I didn’t know about this show until I walked past the venue a few days before and saw the “sold out” signs. But a day or so later, I got an email from the venue offering tickets. So I snapped one up and had an amazing evening. It was the first time I’d seen them, but I strongly suspect it won’t be the last. That’s them in the photo at the top of this post.
  • Roxy Music at the O2 Arena
    Some years there are two shows that force me to the O2 Arena. And this was one of those years. I’ve been a fan of Roxy Music since the 70s but I’ve never seen them live. Honestly, it would have been better to have seen them in the 70s or 80s, but it was still a great show.
  • Beabadoobee at Brixton Academy
    Sometimes you go to see an artist because of one song and it just works out. This was one of those nights. In fact, it turns out I didn’t actually know “Coffee For Your Head” very well – I just knew the sample that was used in another artist’s record. But this was a great night and I hope to see her again very soon.
  • Sugababes at Eventim Apollo
    Another night of fabulous nostalgia. The Eventim Apollo seems to have become my venue of choice to see re-formed girl groups from the 80s and 90s – having seen Bananarama, All Saints and now The Sugababes there in recent years. They have a surprising number of hits (far more than I remembered before the show) and they put on a great show.

Not everything could make the top ten, though. I think this was the first year that I saw Stealing Sheep and they didn’t make the list (their stage shows just get weirder and weirder, and the Moth Club wasn’t a great venue for it), and I was astonished to find myself slightly bored at the Nine Inch Nails show at Brixton Academy.

A few shows sit just outside of the top ten – St. Vincent at the Eventim Apollo, John Grant at the Shepherd’s Bush Empire and Damon Albarn at the Barbican spring to mind.

But, all in all, it was a good year for live music and I’m looking forward to seeing more than sixteen shows this year.

Did you see any great shows this year? Tell us about them in the comments.

The post 2022 in Gigs appeared first on Davblog.

Dave Cross posted a photo:

Goodbye Vivienne

via Instagram instagr.am/p/CmyT_MSNR3-/

Dave Cross posted a photo:

Low sun on Clapham Common this morning

via Instagram instagr.am/p/Cmv4y1eNiPn/

Dave Cross posted a photo:

There are about a dozen parakeets in this tree. I can hear them and (occasionally) see them

via Instagram instagr.am/p/Cmv4rUAta58/

Dave Cross posted a photo:

Sunrise on Clapham Common

via Instagram instagr.am/p/Cmq759NtKtE/

Dave Cross posted a photo:

Brixton Academy

via Instagram instagr.am/p/CmOfgfLtwL_/

Using artificial intelligence (AI) to generate blog posts can be bad for search engine optimization (SEO) for several reasons.

First and foremost, AI-generated content is often low quality and lacks the depth and substance that search engines look for when ranking content. Because AI algorithms are not capable of understanding the nuances and complexities of human language, the content they produce is often generic, repetitive, and lacks originality. This can make it difficult for search engines to understand the context and relevance of the content, which can negatively impact its ranking.

Additionally, AI-generated content is often not well-written or structured, which can make it difficult for readers to understand and engage with. This can lead to a high bounce rate (the percentage of visitors who leave a website after only viewing one page), which can also hurt the website’s ranking.

Furthermore, AI-generated content is often not aligned with the website’s overall content strategy and goals. Because AI algorithms are not capable of understanding the website’s target audience, brand voice, and core messaging, the content they produce may not be relevant or useful to the website’s visitors. This can lead to a poor user experience, which can also hurt the website’s ranking.

Another issue with AI-generated content is that it can be seen as spammy or low quality by both search engines and readers. Because AI-generated content is often produced in large quantities and lacks originality, it can be seen as an attempt to manipulate search engine rankings or trick readers into engaging with the website. This can lead to the website being penalized by search engines or losing the trust and loyalty of its visitors.

In conclusion, using AI to generate blog posts can be bad for SEO for several reasons. AI-generated content is often low quality, poorly written, and not aligned with the website’s content strategy. It can also be seen as spammy or low quality by both search engines and readers, which can hurt the website’s ranking and reputation. It is important for websites to prioritize creating high-quality, original, and relevant content to improve their SEO and provide a positive user experience.

[This post was generated using ChatGPT]

The post 5 Reasons Why Using AI to Generate Blog Posts Can Destroy Your SEO appeared first on Davblog.

“Okay Google. Where is Antarctica?”

Children can now get answers to all their questions using smart speakers and digital voice assistants.

A few years ago, children would run to their parents or grandparents to answer their questions. But with the ascendance of voice assistants into the mainstream in recent years, many children rely more on technology than on humans.

Is this a good idea?

How does it impact the children?

When children interact with people, it helps them be more thoughtful, creative, and imaginative.

When they use artificial intelligence instead, several issues come to the fore. These include access to age-inappropriate content and an increased chance of being rude or unpleasant, which can affect how they treat other people.

Like most technology, smart speakers have both pros and cons. There are benefits to children using these devices, including improving diction, communication and social skills, and gaining information without bothering their parents.

Many families find that smart speakers like Amazon Echo and Google Home are useful. They use them for several functions, ranging from answering questions to setting the thermostat. Research shows that up to nine out of ten children between the ages of four and eleven in the US are regularly using smart speakers — often without parental guidance and control. So, what is the best approach for a parent to take?

Children up to seven years old can find it challenging to differentiate between humans and devices, and this leads to one of the biggest dangers: if a device fulfils their requests even when those requests are made rudely, children may behave in the same way towards other humans.

Do Parents Think Smart Devices Should Encourage Polite Conversations?

Most parents consider it essential that smart devices should encourage polite conversation as part of nurturing good habits in children. The Campaign for a Commercial-Free Childhood (CCFC) is a US coalition of concerned parents, healthcare professionals and educators. Recently, the CCFC protested against the Amazon Echo Dot Kids Edition, stating that it may affect children’s wellbeing, and asked parents to avoid buying it.

However, in reality, these smart devices have improved a lot and focus on encouraging polite conversations with children. It is all about how parents use and present these devices to their children, as these factors can influence them a lot.

In simple terms, parents want these devices to encourage politeness in their children. At the same time, they want their kids to understand the difference between artificial intelligence and humans while using these technological innovations.

Do Parents Think Their Children are Less Polite While Using Smart Speakers?

Many parents have seen their children behave rudely to smart speakers. Several parents have expressed their concerns through social media, blog posts and forums like Mumsnet. They fear these behaviours can impact their kids when they grow up.

A report published by Childwise concluded that children who behave rudely to smart devices might be more aggressive when they grow up, especially when dealing with other humans. It is, therefore, preferable for children to use polite words while interacting with both humans and smart devices.

What Approaches Have Been Taken By Tech Companies to Address the Problem?

With interventions and rising concerns addressed by parents and health professionals, some tech companies have brought changes to virtual assistants and smart speakers.

The parental control features available in Alexa focus on training kids to be more polite. Amazon brands this as Magic Word, where the focus is on positive reinforcement; however, there is no penalty if children don’t speak politely. Available on the Amazon Echo, the tool also has features like setting bedtimes, switching off devices, and blocking songs with explicit lyrics.

Google Home, meanwhile, has a feature called Pretty Please, where Google will only perform an action when children say “please”. For instance: “Okay, Google. Please set the timer for 15 minutes.”

You can enable this feature through Google Family Link, where you will find the settings for Home and Assistant and can apply it to whichever devices you choose. Once you have used it and figured things out, setting it up again is straightforward.

These tools and their approaches are highly beneficial for kids and parents. For now, these devices offer only basic features and limited replies, but in time there could be technological changes that encourage children to have much more efficient and polite interactions.

George and the Smart Home

It was thinking about issues like this which led me to write my first children’s book — George and the Smart Home. In the book, George is a young boy who has problems getting the smart speakers in his house to do what he wants until he learns to be polite to them.

It is available now, as a paperback and a Kindle book, from Amazon.

Buy it from: AU / BR / CA / DE / ES / FR / IN / IT / JP / MX / NL / UK / US

The post Should Children be Polite While Using Smart Speakers? appeared first on Davblog.

S.
author: J.J. Abrams
name: David
average rating: 3.86
book published: 2013
rating: 0
read at:
date added: 2022/01/16
shelves: currently-reading
review:

The Introvert Entrepreneur
author: Beth Buelow
name: David
average rating: 3.43
book published: 2015
rating: 0
read at:
date added: 2020/01/27
shelves: currently-reading
review:


Some thoughts on ways to measure the quality of Perl code (and, hence, get a basis for improving it)

How (and why) I spent 90 minutes writing a Twitterbot that tweeted the Apollo 11 mission timeline (shifted by 50 years)

A talk from the European Perl Conference 2019 (but not about Perl)

Prawn Cocktail Years
author: Lindsey Bareham
name: David
average rating: 4.50
book published: 1999
rating: 0
read at:
date added: 2019/07/29
shelves: currently-reading
review:

Write. Publish. Repeat. (The No-Luck-Required Guide to Self-Publishing Success)
author: Sean Platt
name: David
average rating: 4.28
book published: 2013
rating: 0
read at:
date added: 2019/06/24
shelves: currently-reading
review:


The slides from a half-day workshop on career development for programmers that I ran at The Perl Conference in Glasgow

A (not entirely serious) talk that I gave at the London Perl Mongers technical meeting in March 2018. It talks about how and why I built a web site listing the line of succession to the British throne back through history.
Dave Cross / Tuesday 15 October 2024 18:01