So Painless – Upgrading to LineageOS 22 – Android 15

Just a quick note today about upgrading my Pixel 6 from LineageOS 21 to LineageOS 22, which is Android 15 without the privacy-invading Google parts:

While standard updates can be done on the phone itself, upgrading from one major version to the next requires sideloading. I’m always a bit wary of doing this, because I’m not keen on bricking a device that I use 24/7, or on having to reinstall everything. But LineageOS points out that jumping from one major version to the next via sideloading keeps the data in place. I’ve done it twice now on my Pixel 6 and it worked both times.

So here is the deal: the whole procedure takes about 10 minutes and two shell commands on the PC, one adb command to reboot the phone into sideload mode and one adb command to sideload the latest Lineage ...signed.zip image. And that’s it.
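For reference, the two commands look roughly like this (the zip filename below is just a placeholder; use the actual signed build you downloaded from the LineageOS site):

```shell
# Reboot the phone into sideload mode (USB connected, adb debugging enabled)
adb reboot sideload

# Push the new major-version image; the filename is a placeholder for the
# actual signed build you downloaded
adb sideload lineage-22-signed.zip
```

Once the sideload finishes, the phone reboots into the new version with all data in place.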

For the full details on how to update from one major version to the next, have a look here for all update options or go straight here for the ‘manual’ major version jump info.

HDD Performance – Part 2 – Huge Files Write Performance

In the previous post on this topic, I had a closer look at the write performance of the best backup hard drive I had in stock at home. I bought that drive recently, because I was running out of space on my other drives and because they somehow no longer seemed as snappy as when I initially bought them. So let’s have a look and compare.

Continue reading HDD Performance – Part 2 – Huge Files Write Performance

10 Years of Fiber in Paris

Incredible! Today I realized that it’s the 10th anniversary of fiber connectivity at our home in Paris! 10 YEARS! Here’s my original post from November 2014. I am a bit speechless. These days, I could even upgrade to a 10 Gbit/s connection. Overall, I’d say it’s a success story, with a few bumps and bruises in between. The biggest one was certainly 2 years ago, when it took 4 months to get my fiber line back in service after it failed. But OK, you live and you learn. Forget competition and resellers, just go to the company that owns the fiber. But I don’t want to dwell on this today. 10 YEARS!

HDD Performance – Part 1 – Huge Files on a New 20TB Drive

My data heap keeps growing, and I do have a good multi-layer and multi-location backup strategy. Offline and off-site storage is the motto of the day, which requires hard disks with large capacities so data can be physically moved. So far, I have used several 8 TB hard disks to which I would sync the data from various sources. I’ve come to a point, however, where 8 TB is no longer enough, and at the same time, I noticed a significant slowdown during my backup procedures. So I bought my first 20 TB drive which, so far, performs very nicely. But I really do wonder why my 8 TB drives seem to have slowed down so much while that shiny new 20 TB drive (still?) performs much better. So it was time to run some benchmarks with different drives and real-world data, to see how new drives perform with my data and to analyze the performance of existing drives. But why do I care? Because it makes a huge difference whether 10 TB of data is moved to or from a disk drive at an average of 50 MB/s or 200 MB/s. At 50 MB/s, moving such an amount of data takes 55 hours, while at 200 MB/s it only takes 13 hours. And we are not even talking 20 TB yet. You see where this goes…
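The arithmetic behind these numbers is simple enough to check on the command line (using decimal units, i.e. 1 TB = 1,000,000 MB):

```shell
# Transfer time for 10 TB at two sustained data rates
size_mb=$((10 * 1000 * 1000))                        # 10 TB expressed in MB
echo "at 50 MB/s:  $((size_mb / 50 / 3600)) hours"   # ~55 hours
echo "at 200 MB/s: $((size_mb / 200 / 3600)) hours"  # ~13 hours
```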

Continue reading HDD Performance – Part 1 – Huge Files on a New 20TB Drive

500 Mbps Bandwidth Throttling – Part 2

In the previous post, I had a look at how a high-speed data transfer between two data centers in different countries is throttled to 500 Mbps. In this post, I’ll have a look at what the TCP sequence and transmission window graphs look like for the same throttling scenario, this time when downloading data from my server over an FTTH fiber line in Paris.

Continue reading 500 Mbps Bandwidth Throttling – Part 2

500 Mbps Bandwidth Throttling – Part 1

A few months ago, I moved my services, such as this blog, from a bare metal server in a data center in Finland to another bare metal server in France. One drawback of the move was that the bandwidth to the server is limited to 500 Mbps instead of the 1 Gbps the network interface could provide. And indeed, the data center operator does enforce the 500 Mbps limit in the downlink direction. Recently, I wondered how that is actually done in practice, so I had a closer look with Wireshark. As you can see above, the result is quite interesting!

Continue reading 500 Mbps Bandwidth Throttling – Part 1

Bucket Watching – S3 at Hetzner and Scaleway

I’m old school, I like locally attached block devices for data storage. Agreed, we are living in the age of cloud, but so far, the amount of data I store at home and in data centers could always be placed on block devices, i.e. flash drives directly connected to the server. Recently, however, I’ve been thinking a bit about how to store images and videos in the cloud and how to upload and synchronize such data from different devices. That means that a few hundred gigabytes will definitely not do anymore, we are quickly talking about TBs here. Locally attached or network block storage of such a magnitude in a data center is quite expensive, we are talking 50 to 100 euros per TB per month. But perhaps there is another option? Many cloud providers also offer S3-compatible object storage today at one tenth of the cost, i.e. €6 per TB per month. Could that be an alternative?
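To put the price gap into concrete numbers, here is a quick back-of-the-envelope comparison for a hypothetical 5 TB photo and video archive, using the ballpark figures above (actual prices vary by provider):

```shell
# Monthly storage cost for 5 TB, ballpark figures:
# block storage €50-100 per TB, S3-compatible object storage ~€6 per TB
tb=5
echo "block storage:  €$((tb * 50)) to €$((tb * 100)) per month"
echo "object storage: ~€$((tb * 6)) per month"
```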

Continue reading Bucket Watching – S3 at Hetzner and Scaleway

Quantum Safe – Some Thoughts

There is quite a bit of momentum in the industry right now to prepare for the day when quantum computers have become powerful enough to break today’s authentication and encryption algorithms. Here’s a video that explains the issue to a general audience. All of this got me thinking about what that means for my data, particularly the data I exchange with my self-hosted cloud services today. So here are some thoughts on the topic, to be revisited from time to time as the topic comes up again.

Continue reading Quantum Safe – Some Thoughts

Note Taking with Joplin

While listening to talks at conferences, I tend to take notes either with pen and paper or on my phone or notebook with the Nextcloud notes app. While pen and paper notes can be structured nicely, the notes often don’t make it to my digital archive later on and are thus pretty much lost. Hence, they are only useful during the event and perhaps a few days after. Using the Nextcloud notes app is better in this regard, but I can’t quite get my head around how to produce structured markup text with it. Then, back in summer, I saw someone using an app during a conference that supports structured text input with markup characters: Joplin. It looked interesting, and I finally managed to give the app a try on my notebook and phone during 38c3 at the end of 2024 for taking notes during talks. So let’s have a look at how that went.

Continue reading Note Taking with Joplin

Wireguard – Ubuntu 24.04 Client

And I’m back with yet another post on Wireguard. After Ubuntu 22.04 left me with mixed feelings when it came to Wireguard, I was positively surprised to see that Wireguard has finally been integrated into Ubuntu 24.04’s GUI. So let’s see whether the shortcomings of the command line tools in previous Ubuntu versions have also been addressed.

Continue reading Wireguard – Ubuntu 24.04 Client