I can’t be the only programmer who does this. You’re looking for an online service to fill some need in your life. You look at three or four competing products and they all get close, but none of them do everything you want. Or maybe they do tick all the boxes, but they cost that little bit more than you’re comfortable paying. After spending a few hours on your search, that little voice pops up in your head with that phrase that you really don’t want to hear:
Maybe you should just write your own version. How hard can it be?
A couple of hours later, you have something that vaguely works, you’ve learned more than you thought there was to learn about some obscure corner of life and you’re the proud owner of another new domain.
Please tell me it’s not just me.
So today I’ve been working on my Linktree clone.
Honestly, I can’t remember what it was about Linktree or its existing clones that I didn’t like. I suspect it’s that I just wanted more control over my links page than a hosted service would give me. All I can be sure of is that in September 2022 I made the first commit to a project that, eighteen months later, I’m still maintaining and improving.
To be fair to myself, I didn’t buy a new domain. That means I’m getting better, right? The output is hosted at links.davecross.co.uk. I’m not even paying for hosting as it’s all hosted on GitHub Pages – it’s a static site that has occasional changes, so it’s perfect for GitHub Pages.
But I have spent quite a lot of time working on the code. Probably more than is reasonable for a web site that gets a dozen visits in a good month. Work on it seems to come in waves. I’ll go for months without touching it, and then I’ll spend a week or so working on it pretty much every day. Over the last 24 hours or so, I’ve passed an important milestone. Like all of these little side projects, this one started out as a largely unstructured code dump – as I worked to get it doing something that approximated the original goal. Then I’ll spend some time (months, usually) where fixes and improvements are implemented by hacking on the original horrible code. At some point, I’ll realise that I’m making things too difficult for myself and I’ll rewrite it (largely from scratch) to be better structured and easier to maintain. That’s where I got to today. The original single-file code dump has been rewritten into something that’s far nicer to work on. And as a side benefit, I’ve rewritten it all using Perl’s new, built-in object orientation features – which I’m loving.
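For anyone who hasn’t tried the new syntax yet, here’s a minimal sketch of the kind of class you can now write with the built-in object orientation features (experimental from Perl 5.38). The Link class and its fields are hypothetical – an illustration of the syntax, not the actual code from my project.

use v5.38;
use experimental 'class';

# A hypothetical Link class - roughly the shape of thing a links-page
# generator might use. Fields marked :param can be set via the constructor.
class Link {
    field $title :param;
    field $url   :param;

    method as_html {
        return qq[<a href="$url">$title</a>];
    }
}

my $link = Link->new(
    title => 'Perl Hacks',
    url   => 'https://perlhacks.com/',
);
say $link->as_html;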
Oh, and I guess that’s the upside of having little side projects like this – I get to try out new features like the new OO stuff in a no-pressure environment. And just spending time doing more programming has to make you a better programmer, right? And surely it’s just a matter of time before one of these projects takes off and turns me into a millionaire! I’m not saying for a minute that having pointless side projects is a bad idea. I’m just wondering how many pointless side projects are too many.
So, that’s my guilty secret – I’m a serial writer of code that doesn’t really need to be written. What about you? How many pointless side projects do you have? And how much of your spare time do they use up?
The post Pointless personal side projects appeared first on Perl Hacks.
The future is already here – it’s just not very evenly distributed
– William Gibson
The quotation above was used by Tim O’Reilly a lot around the time that Web 2.0 got going. Over recent months, I’ve had a few experiences that have made it clear to me that even the present isn’t particularly evenly distributed either. It’s always easy to find people still using technologies that we would consider archaic (and not in a rustic or hipster way).
We’ve known for twenty years that CGI is a bad idea. It’s almost ten years since CGI.pm was removed from Perl core. Surely, all of us are using something modern for web development these days.
Well, apparently not. CGI is alive and well and living on the fringes of the Perl community. I’ve come across it being used in some quite surprising places over the last year or so. I’m going to obfuscate some details in the following descriptions to, hopefully, prevent you (or, worse, the people involved) from recognising the companies involved.
None of this should be taken as an argument that the nms project was wrong to use CGI.pm or that the Perl 5 Porters were wrong to remove it from the Perl standard library. I still support both decisions. I just found it a bit jarring to be reminded that while we’re all using PSGI or Mojolicious to write microservices in Perl that serve REST APIs that are developed and deployed in Docker containers, there are still people out there who are struggling to FTP code that was written in 1997 onto low-end shared hosting.
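For anyone who has only ever deployed CGI scripts, here’s a rough idea of what the other end of that spectrum looks like – a minimal PSGI application. This is just a hello-world sketch to show the shape of the interface, not a recommendation for how to structure a real app.

#!/usr/bin/perl
use strict;
use warnings;

# A PSGI application is just a code reference. It receives the request
# environment as a hashref and returns an arrayref containing the status,
# the headers and the body.
my $app = sub {
    my $env = shift;

    return [
        200,
        [ 'Content-Type' => 'text/plain' ],
        [ "Hello from PSGI\n" ],
    ];
};

$app;

Save that as app.psgi and you can run it locally with "plackup app.psgi", or hand it to any PSGI-compatible server – no FTPing files onto shared hosting required.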
I think this state of affairs has two causes. Firstly (like the first client I mentioned above) some systems were set up when CGI was still in common use – and things haven’t changed since. These people get a sudden shock when they are forced to move to a more modern server for some reason. And then there are people like my Fiverr clients who install Perl CGI programs because that’s what they have always done and they don’t know that there is an alternative approach. Part of the problem there is, presumably, that Perl has meant badly-written CGI programs for a large proportion of the web’s existence, which means anyone searching for information on this subject is likely to find pages and pages of advice telling them how to install CGI programs before they discover anything about PSGI or Docker. And I think there might be a solution to that problem (or, at least, a way to nudge the web in the right direction).
Last weekend, I was cataloguing subdomains (I know how to have fun!) and I found a web site that I had forgotten about. I had obviously been contemplating a very similar situation back in 2016.
The site is called Perl Web Advice. The intention was (is?) that it would be a definitive source of good advice about how to develop and deploy web applications written in Perl. I had only made tiny inroads into the task before something else apparently seemed more fun and the project was abandoned.
But there’s the start of a framework for the site. And, this week, I’ve given it a GitHub Actions workflow so it gets republished automatically whenever changes are pushed to the repo. I’ve even set up a Dockerfile to make it easy to use the static site generator that I’ve used for it. So perhaps the idea has merit. Once there’s a bit more useful content there I could see if I can remember any of my SEO knowledge and get it appearing in results where people are looking for advice on this topic.
I would, of course, be happy to consider contributions from other people. What do you think? Would you like to help me save people from the hell of CGI deployments?
The post The present isn’t evenly distributed either appeared first on Perl Hacks.
You might remember that I’ve been taking an interest in GitHub Actions for the last year or so (I even wrote a book on the subject). And at the Perl Conference in Toronto last summer I gave a talk called “GitHub Actions for Perl Development” (here are the slides and the video).
During that talk, I mentioned a project I was working on to produce a set of reusable workflows that would make it easier for anyone to start using GitHub Actions in their Perl development. Although (as I said in the talk) things were moving pretty quickly on the project at the time, once I got back to London, several other things became more important and work on this project pretty much stalled. But over the last couple of weeks, I’ve returned to this project and I’ve finally got some of the workflows into a state where I’ve been using them successfully in my GitHub repos and I think they’re now ready for you to start using them in yours. There are three workflows that I’d like you to try: cpan-test, cpan-coverage and cpan-perlcritic.
And using these workflows in your GitHub repos is as simple as creating a new file in the .github/workflows directory which contains something like this:
name: CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
  workflow_dispatch:

jobs:
  build:
    uses: PerlToolsTeam/github_workflows/.github/workflows/cpan-test.yml@main
  coverage:
    uses: PerlToolsTeam/github_workflows/.github/workflows/cpan-coverage.yml@main
  perlcritic:
    uses: PerlToolsTeam/github_workflows/.github/workflows/cpan-perlcritic.yml@main
There are a couple of parameters you can use to change the behaviour of these workflows. In the Toronto talk, I introduced the idea of a matrix of tests, where you can test against three operating systems (Linux, MacOS and Windows) and a list of Perl versions. By default, the cpan-test workflow uses all three operating systems and all production versions of Perl from 5.24 to 5.38. But you can change that by using the perl_version and os parameters. For example, if you only wanted to test on Ubuntu, using the most recent two versions of Perl, you could use this:
build:
  uses: PerlToolsTeam/github_workflows/.github/workflows/cpan-test.yml@main
  with:
    perl_version: "['5.36', '5.38']"
    os: "['ubuntu']"
Annoyingly, the parameters to a reusable workflow can only be a single scalar value. That’s why we have to use a JSON-encoded string representing an array of values. Maybe this will get better in the future.
The cpan-perlcritic workflow also has a parameter. You can use level to change the level that perlcritic runs at. The default is 5 (the gentlest level) but if you were feeling particularly masochistic, you could do this:
perlcritic:
  uses: PerlToolsTeam/github_workflows/.github/workflows/cpan-perlcritic.yml@main
  with:
    level: 1
The workflows are, of course, available on GitHub. It would be great to have some people trying them out and reporting back on their experiences. Raising issues and sending pull requests is very much encouraged.
Please let me know how you get on with them.
The post GitHub Actions for Perl Development appeared first on Perl Hacks.
I really thought that 2023 would be the year I got back into the swing of seeing gigs. But, somehow I ended up seeing even fewer than I did in 2022 – 12, when I saw 16 the previous year. Sometimes, I look at Martin’s monthly gig round-ups and wonder what I’m doing with my…
The post 2023 in Gigs appeared first on Davblog.
I’ve mentioned before how much I enjoyed Olaf Alders’ talk, Whither Perl, at the Perl and Raku Conference in Toronto last month. I think it’s well worth spending forty minutes watching it. It triggered a few ideas that I’ll be writing about over the coming weeks and, today, I wanted to start by talking briefly about the idea of GitHub Organisations and introducing a couple of organisations that I’ve recently set up.
Olaf talks about GitHub Organisations as a way to ensure the continuity of your projects. And I think that’s a very important point. I’ve written a few times over recent years about the problems of getting fixes into CPAN modules when the maintainer has gone missing. This is also true of non-CPAN projects that members of the community might find useful. By setting up an organisation and inviting a few trusted people to join it, you can share the responsibility of looking after your projects. So if you lose interest or drift away, there’s a better chance that someone else will be able to take up the reins.
It’s actually a thought that I had before going to Toronto. A couple of months ago, I set up the Perl Tools Team organisation and transferred ownership of four of my repos.
There will, no doubt, be other repos that I’ll want to transfer over to this organisation in time. And, of course, if you have a tool that is used by the Perl community, I’ll be happy to add it to the list. Just get in touch and we can talk about it.
The other organisation (and I set this up just this morning) now owns all of the repos for my CPAN modules. I won’t be adding anybody else’s repos to this organisation, but if you send PRs to any of these projects (and I’m looking at you, Mohammad) then don’t be surprised if you get added to the organisation too! If you’ve watched my talk on GitHub Actions for Perl Development, then you might remember that I was developing a GitHub Workflow definition for releasing modules to CPAN. That’s still a work in progress, but now I’m thinking that I could add my PAUSE credentials to the GitHub secrets store for this organisation and the GitHub workflow could release the code to CPAN using my credentials without my input (but, obviously, I’m still considering the security implications of that – it would certainly only ever be available to me and a few trusted lieutenants).
This is still all very new and it will definitely develop in several directions over the coming months. But it feels like a move in the right direction. After twenty years of CPAN releases, it feels like I’m turning my work into a “proper” project. And, hopefully, it can serve as a template that other people can follow. I’ll let you know how it goes.
So, what do you think? Is this the right model for CPAN development (and, also, Perl infrastructure development) moving forward? Would you be interested in joining either of these organisations? Do you have any tools that the Perl Tools Team could maintain for you?
The post GitHub Organisations appeared first on Perl Hacks.
It’s been over twenty years since I spoke at a conference in North America. That was at OSCON in San Diego. I’ve actually never spoken at a YAPC, TPC or TPRC in North America. I have the standard European concern about being seen to encourage the USA’s bad behaviour by actually visiting it, so when I saw this year’s TPRC was in Canada, I thought that gave me the perfect opportunity to put that right.
So I proposed a talk which was accepted.
It was also the first time I’d been to any kind of conference since before the pandemic. My last conference was in Riga in 2019.
Despite Air Transat’s determination to prevent it from happening, my girlfriend and I made it to Toronto a few days before the conference started. It was her birthday, so we spent Sunday and Monday relaxing and getting to know Downtown Toronto. On Monday afternoon, we moved to the conference hotel and prepared to geek out.
One of the first people I spoke to at the conference on Tuesday morning was fellow Londoner Mohammad Anwar. As is the law (I don’t make the rules!) I was mildly rebuking him about the ridiculous amount of work he puts into the Perl community. I told him the story of a senior member of the community who, about ten years ago, said to me: “I don’t understand why you still make so much effort, Dave. You have your White Camel, don’t you?” I swear I didn’t know that Mohammad was about to be awarded the 2022 White Camel – but it gave me the opportunity to go up to him and say, “See, you can stop making such an effort now!” I hope he doesn’t really stop; but he should really take things a bit easier.
The next three days were a happy blur of geekery. As always at these conferences, there were too many talks that I wanted to see and, inevitably, I still have to catch up on some of them on YouTube (thanks to the dedicated video team for making them available so quickly).
There are a number of talks that I’d like more people to see. I think it would be a great use of your time to watch these videos:
I gave two talks – a lightning talk on CPAN Dashboard and a longer talk on GitHub Actions for Perl Development. After not giving a talk for four years, I felt a little rusty – but I think they went ok.
And then, after a seemingly-fleeting three days, it was all over and we all returned to our own countries. There’s another conference in Finland next month. Unfortunately, I’m unable to be there – and last week’s experience makes me regret that.
It was great to catch up with old friends and share our mutual interest in Perl. It was particularly great after four years without a conference to go to. I hope it’s not four years until I’m at another.
The post The Perl and Raku Conference, Toronto 2023 appeared first on Perl Hacks.
Rather later than usual (again!) here is my review of the best ten gigs I saw in 2022. For the first time since 2019, I did actually see more than ten gigs in 2022 although my total of sixteen falls well short of my pre-pandemic years. Here are my ten favourite gigs of the year.
The post 2022 in Gigs appeared first on Davblog.
Using artificial intelligence (AI) to generate blog posts can be bad for search engine optimization (SEO) for several reasons.
First and foremost, AI-generated content is often low quality and lacks the depth and substance that search engines look for when ranking content. Because AI algorithms are not capable of understanding the nuances and complexities of human language, the content they produce is often generic, repetitive, and lacks originality. This can make it difficult for search engines to understand the context and relevance of the content, which can negatively impact its ranking.
Additionally, AI-generated content is often not well-written or structured, which can make it difficult for readers to understand and engage with. This can lead to a high bounce rate (the percentage of visitors who leave a website after only viewing one page), which can also hurt the website’s ranking.
Furthermore, AI-generated content is often not aligned with the website’s overall content strategy and goals. Because AI algorithms are not capable of understanding the website’s target audience, brand voice, and core messaging, the content they produce may not be relevant or useful to the website’s visitors. This can lead to a poor user experience, which can also hurt the website’s ranking.
Another issue with AI-generated content is that it can be seen as spammy or low quality by both search engines and readers. Because AI-generated content is often produced in large quantities and lacks originality, it can be seen as an attempt to manipulate search engine rankings or trick readers into engaging with the website. This can lead to the website being penalized by search engines or losing the trust and loyalty of its visitors.
In conclusion, using AI to generate blog posts can be bad for SEO for several reasons. AI-generated content is often low quality, poorly written, and not aligned with the website’s content strategy. It can also be seen as spammy or low quality by both search engines and readers, which can hurt the website’s ranking and reputation. It is important for websites to prioritize creating high-quality, original, and relevant content to improve their SEO and provide a positive user experience.
[This post was generated using ChatGPT]
The post 5 Reasons Why Using AI to Generate Blog Posts Can Destroy Your SEO appeared first on Davblog.
“Okay Google. Where is Antarctica?”
Children can now get answers to all their questions using smart speakers and digital voice assistants.
A few years ago, children would run to their parents or grandparents to answer their questions. But with the ascendance of voice assistants to the mainstream in recent years, many children rely more on technology than on humans.
Is this a good idea?
How does it impact the children?
When children interact with people, it helps them be more thoughtful, creative, and imaginative.
When they use artificial intelligence instead, several issues come to the foreground. These include access to age-inappropriate content and an increased possibility of being rude or unpleasant, which can affect how they treat others.
As with most technology, there are both pros and cons. There are benefits to children using these devices, including improved diction, communication and social skills, and the ability to get information without bothering their parents.
Many families find that smart speakers like Amazon Echo and Google Home are useful. They use them for several functions, ranging from answering questions to setting the thermostat. Research shows that up to nine out of ten children between the ages of four and eleven in the US are regularly using smart speakers — often without parental guidance and control. So, what is the best approach for a parent to take?
Children up to seven years old can find it challenging to differentiate between humans and devices, and this can lead to one of the biggest dangers. If the device fulfils their requests even when those requests are made rudely, children may start to behave the same way towards other humans.
Most parents consider it essential that smart devices should encourage polite conversation as part of nurturing good habits in children. The Campaign for a Commercial-Free Childhood (CCFC) is a US coalition of concerned parents, healthcare professionals, and educators. Recently, the CCFC protested against the Amazon Echo Dot Kids Edition, stating that it may affect children’s wellbeing. Because of this, they requested that parents avoid buying the Amazon Echo.
However, in reality, these smart devices have improved a lot and focus on encouraging polite conversations with children. It is all about how parents use and present these devices to their children, as these factors can influence them a lot.
To put it simply, parents want these devices to encourage politeness in their children. At the same time, they want their kids to understand the difference between artificial intelligence and humans while using these technological innovations.
Many parents have seen their children behave rudely to smart speakers. Several parents have expressed their concerns through social media, blog posts and forums like Mumsnet. They fear these behaviours can impact their kids when they grow up.
A report published by Child Wise concluded that children who behave rudely to smart devices might be aggressive when they grow up, especially when dealing with other humans. It is, therefore, preferable for children to use polite words when interacting with both humans and smart devices.
With interventions and rising concerns addressed by parents and health professionals, some tech companies have brought changes to virtual assistants and smart speakers.
The parental control features available in Alexa focus on training kids to be more polite. Amazon brands this as Magic Word, and the focus is on positive reinforcement; there is no penalty if children don’t speak politely. Available on the Amazon Echo, this tool also has added features like setting bedtimes, switching off devices, and blocking songs with explicit lyrics.
When it comes to Google Home, it has a new feature called Pretty Please. Here, Google will only perform an action when children say “please”. For instance, “Okay, Google. Please set the timer for 15 minutes.”
You can enable this feature through Google Family Link, where you will find the settings for Home and Assistant, and you can apply it to whichever devices you choose. Once you have used it and figured things out, setting it up again is straightforward.
These tools and their approaches are highly beneficial for kids and parents. As of now, these devices only offer basic features and limited replies. But with time, there could be technological changes that encourage children to have much more efficient and polite interactions.
It was thinking about issues like this which led me to write my first children’s book — George and the Smart Home. In the book, George is a young boy who has problems getting the smart speakers in his house to do what he wants until he learns to be polite to them.
It is available now, as a paperback and a Kindle book, from Amazon.
Buy it from: AU / BR / CA / DE / ES / FR / IN / IT / JP / MX / NL / UK / US
The post Should Children be Polite While Using Smart Speakers? appeared first on Davblog.
A little later than usual, here’s my review of the gigs I saw last year.
In 2020, I saw four gigs. In 2021, I almost doubled that to seven. Obviously, we spent a lot of the year with most music venues closed, so those few gigs I saw were all in the second half of the year. Usually, I’d list my top ten gigs. This year (as last year) I’ll be listing them all. So here they are in chronological order.
And that was 2021. What will happen in 2022? Well, I have tickets for a dozen or so shows, but who knows how many of them I’ll actually see? I’ve already had emails postponing the Wolf Alice and Peter Hook shows I was going to see this month. I guess I’ll just have to wait and see how the rest of the year pans out.
The post 2021 in Gigs appeared first on Davblog.