In the first part of this series, I showed you how to deploy Vaultwarden on Azure Container Apps. In this article, I will show you how to back up the data and restore it in case of a disaster.
Vaultwarden stores all data in a SQLite database. As shown in part 1, I’m storing all data in an Azure storage account file share.
The easiest way to back up all the data is to copy the database files and all other files from the file share to another location.
However, the SQLite database files are in use while the container is running. Therefore, I use the following script to stop the container, copy the files, and start the container again. This only takes a few seconds, and if you run it at night, you will hardly notice any downtime.
I’m using the Azure CLI to stop and start the container. The script runs on one of my computers once a day.
Replace `TODO_YOUR_STORAGE_ACCOUNT_NAME`, `TODO_YOUR_ACCOUNT_KEY`, `TODO_RESOURCE_GROUP_NAME`, and `TODO_CONTAINER_APP_NAME` with the values from your environment.
```powershell
# Enable these two lines if you are using the storage account network rules.
# This will allow access to the storage account from everywhere.
#az storage account update --default-action Allow --name TODO_YOUR_STORAGE_ACCOUNT_NAME
#start-sleep 60

# Get the current active revision
$rev = az containerapp revision list -n TODO_CONTAINER_APP_NAME -g TODO_RESOURCE_GROUP_NAME --query [0].name --output tsv

# Deactivate the current revision
az containerapp revision deactivate -n TODO_CONTAINER_APP_NAME -g TODO_RESOURCE_GROUP_NAME --revision $rev

# Download the files from the storage account file share
az storage file download-batch --account-key TODO_YOUR_ACCOUNT_KEY --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --destination ./vaultwarden --no-progress --source vaultwarden

# Reactivate the revision
az containerapp revision activate -n TODO_CONTAINER_APP_NAME -g TODO_RESOURCE_GROUP_NAME --revision $rev

# If you are using the storage account network rules, disable access from everywhere again.
#az storage account update --default-action Deny --name TODO_YOUR_STORAGE_ACCOUNT_NAME
```
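To run the backup automatically once a day, you can register the script as a scheduled task. Here is a minimal sketch for Windows, assuming the script above is saved as `C:\scripts\backup-vaultwarden.ps1` (path, task name, and time are placeholders):

```powershell
# Register a daily task that runs the backup script at 03:00
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\scripts\backup-vaultwarden.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName "VaultwardenBackup" -Action $action -Trigger $trigger
```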
In case of an emergency, you can restore the Vaultwarden instance with the steps from part 1. Before you start the container, just copy all the data back to the storage account file share, then start the container, and you are done.
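The restore upload is simply the backup in reverse. A minimal sketch, reusing the placeholders and the `./vaultwarden` backup folder from the script above:

```powershell
# Upload the backed-up files to the file share before starting the container again
az storage file upload-batch --account-key TODO_YOUR_ACCOUNT_KEY --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --destination vaultwarden --source ./vaultwarden
```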
In the first part of this series, I showed you how to deploy Vaultwarden on Azure Container Apps. In this article, I will show you how to use a custom domain with a free Let’s Encrypt TLS certificate.
⚠️ UPDATE 2023/06: Container Apps now supports free managed certificates. The Container Apps Acmebot is no longer required!
For now, Azure Container Apps supports custom domains but no automatically managed TLS certificates. So we will use an Azure Function to connect to Let’s Encrypt, obtain the certificate, store it, and assign it to the container app.
Luckily, there is a great solution on GitHub: Container Apps Acmebot.
Before you go on, make sure to add a TXT and a CNAME record for your custom domain with the data from the container app. You can find the data in the Azure Portal in the `Custom domains` section.
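If your DNS zone happens to be hosted in Azure DNS, both records can also be created via the CLI. A minimal sketch (the zone, record names, and verification value are placeholders; take the real values from the `Custom domains` section):

```powershell
# TXT record with the domain verification ID from the container app
az network dns record-set txt add-record -g TODO_DNS_RESOURCE_GROUP -z example.com -n asuid.vault --value "TODO_VERIFICATION_ID"

# CNAME record pointing the custom domain to the container app FQDN
az network dns record-set cname set-record -g TODO_DNS_RESOURCE_GROUP -z example.com -n vault --cname TODO_CONTAINER_APP_FQDN
```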
It is really easy to use. You can deploy it in the same Azure subscription where you already deployed Vaultwarden.
After installation (via the Deploy to Azure button in the GitHub repo), you open a simple website, select the domain and the container app to use, and that’s it.
After a few minutes, you can see the certificate in the Azure Portal within the container app:
If you want to get rid of the hassle of managing your passwords in Excel, or you want to dump LastPass (after the latest breach), 1Password, or other cloud-based solutions, you should take a look at Vaultwarden.
It is a Bitwarden clone. Bitwarden is a cloud password manager, but it is also open source and can be self-hosted: you can run it on your own server or in the cloud. Either way, it is under your control and you decide what happens to your data.
Vaultwarden is an unofficial Bitwarden server implementation written in Rust. It can run, for example, on a Raspberry Pi or in a Docker container, and you can make it public or keep it in your own private/home network.
In this article, I will show you how to deploy Vaultwarden on Azure Container Apps for less than 1 EUR per month (if you do not need the extra network security; otherwise, Azure automatically creates a load balancer for about $18/month).
UPDATE: I was wrong regarding the pricing. I had set the scaling to minReplicas=1, which means there is always one running instance; that costs about 10 EUR/month. It is only as cheap as 1 EUR if you scale to 0 instances when there are no requests. I’ve updated the Bicep/ARM template to use 0 as the minReplicas setting.
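For reference, this is roughly what the relevant fragment of the container app resource looks like in the Bicep template (a sketch following the `Microsoft.App/containerApps` schema, not the full template):

```bicep
template: {
  scale: {
    minReplicas: 0 // scale to zero when there are no requests
    maxReplicas: 1
  }
}
```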
Vaultwarden stores all data in a SQLite database within the file system (encrypted, of course). I’m using a file share in a storage account, which is mounted as a volume in the container. This way, the data is persisted even if the container is restarted or the container app is deleted. If you run it without a mounted volume, all data is lost when the container restarts.
In the Vaultwarden repository there are some discussions about running the SQLite database in an Azure storage account. I’ve tested it, and it works fine with WAL enabled. I had to explicitly set the `ENABLE_DB_WAL` environment variable to `true` to enable it. If you don’t, the database will be locked and you will get an error message when you try to access the web interface.
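If you deployed without it, the variable can also be set afterwards with the CLI. A minimal sketch, using the usual placeholders:

```powershell
# Enable SQLite write-ahead logging for Vaultwarden
az containerapp update -n TODO_CONTAINER_APP_NAME -g TODO_RESOURCE_GROUP_NAME --set-env-vars ENABLE_DB_WAL=true
```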
First of all, you need an Azure account. If you don’t have one, you can create a free account here. You need to enter a credit card to verify your identity, but the solution will cost you less than 1 EUR per month (if you are using it for personal use with your family members).
Update: You can now decide whether you want virtual network integration or not. If you choose virtual network integration, Azure automatically creates a load balancer for you, which you have to pay for.
You can do all of this directly within the Azure portal; however, some things are not possible in the UI right now (like mounting a file share as a volume in the container app).
Therefore, I’ve created a Bicep template that creates all the required resources.
You can find the Bicep template and instructions how to deploy it here or you can just click this button:
The container app references the Vaultwarden Docker Hub image and runs it automatically. You can customize the installation by setting some environment variables. All configuration options can be found in the Vaultwarden wiki.
The storage account only allows access from the virtual network where the container app resides if you choose to enable virtual network support.
There is an issue with the SQLite database on an Azure file share: the database cannot be created automatically. Therefore, you need to put an empty Vaultwarden database into the file share upfront.
⚠️ Important: Before you can proceed, make sure the storage account is accessible. Go to the storage account’s `Networking` tab and check `Enabled from all networks`. After you have copied the files as described below, make sure to switch back to `Enabled from selected virtual networks and IP addresses`.
Here is an empty database that you can use. Just download the zip file, unzip all 3 files, and upload them to the file share: open the `vaultwarden` file share in the Azure Portal, click `Upload`, and upload the 3 files.
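If you prefer the CLI over the portal for the upload, here is a minimal sketch (account placeholders as before; the file names follow Vaultwarden’s defaults):

```powershell
# Upload the three unzipped database files to the vaultwarden file share
az storage file upload --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --account-key TODO_YOUR_ACCOUNT_KEY --share-name vaultwarden --source ./db.sqlite3
az storage file upload --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --account-key TODO_YOUR_ACCOUNT_KEY --share-name vaultwarden --source ./db.sqlite3-wal
az storage file upload --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --account-key TODO_YOUR_ACCOUNT_KEY --share-name vaultwarden --source ./db.sqlite3-shm
```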
After the deployment is finished and you have created the empty database, you can access the web interface of Vaultwarden by clicking on the URL in the container app overview.
You need to register a new user and then you can start adding your passwords.
I recommend using a strong master password and enabling MFA for all users if you are running Vaultwarden on the public internet.
Long story short: if you have a personal workspace in Postman, create a team, and then decide you do not want the team anymore and delete it, all your personal workspaces and collections are deleted automatically.
I tried to be smart and took my second device offline before opening Postman. However, Postman just said: you are offline, so you cannot access your synced workspaces.
That means: Everything is gone and cannot be recovered.
That was the trigger to move to another tool. I tried Insomnia and it is a great alternative to Postman. It is open source and has a lot of features.
You do not even need a paid account, and you can easily sync your workspaces via git to any repository you want. It also has a lot of plugins you can use to extend the functionality.
To start a new collection, you should create a new `Design Document`, not a `Request Collection`. The advantage of a `Design Document` is that you can work on an OpenAPI specification or create requests from scratch. Also, git sync is only available for `Design Documents`, and you get these 3 tabs:
The `Design` tab is where you can create your requests as an OpenAPI specification or import them from a YAML or JSON file. The `Debug` tab is where you can test your requests or create new ones. The `Test` tab is where you can create test suites with tests for your requests. As you can see in the following image, the tests are written in JavaScript, so you have a lot of options.
I am really happy with Insomnia and will use it for all my API testing. It is a great alternative to Postman, and I can recommend it especially for the plugins and the free git sync.
The charging station also has a relay output that is switched when an alarm comes in. That naturally opens up a lot of possibilities to integrate it into my smart home.
I connected the relay output to an input of my Loxone system. From there, I trigger a NodeRED flow with 3 main parts:
In the upper part (yellow), I send a toast notification to my TV, which then shows a small popup in the top-right corner (only if the TV is currently on, of course).
In the middle part (blue), I use a Loxone block to check whether the house is in “night mode” and, if so, switch on the hallway light (and switch it off again after 3 minutes). The presence detector would do this anyway when I walk into the hallway, but this way you can already see light from the bed and don’t immediately wake everyone else in the house. I also disarm the alarm system so that opening the front door doesn’t set off the alarm.
In the lower part (red), I check whether the Tesla is at home. For this, I use the more or less official Tesla API. With it, I first wake up the car in order to unlock it right away. Since you cannot reliably check when the car has woken up successfully, I try to unlock it once immediately (it might already be awake) and then again after 10 and after 30 seconds.
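The retry logic is simple. Here is a minimal PowerShell sketch of it (the endpoints are the well-known unofficial owner API ones; vehicle ID and token are placeholders):

```powershell
# Wake the car, then try to unlock it immediately and again after 10 and 30 seconds
$headers = @{ Authorization = "Bearer TODO_ACCESS_TOKEN" }
$base    = "https://owner-api.teslamotors.com/api/1/vehicles/TODO_VEHICLE_ID"

Invoke-RestMethod -Method Post -Uri "$base/wake_up" -Headers $headers

foreach ($delay in 0, 10, 20) {
    Start-Sleep -Seconds $delay   # cumulative: 0s, 10s, 30s
    try {
        Invoke-RestMethod -Method Post -Uri "$base/command/door_unlock" -Headers $headers
        break   # stop retrying once the unlock succeeded
    } catch {
        # the car is probably still waking up; try again
    }
}
```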
If that was successful, I also check whether the car is currently plugged into the charging station. If so, I disable my wallbox, which immediately releases the plug so I can unplug it right away.
Ready for the next call-out.
You want to buy a Tesla too? Then feel free to use my referral link; that way we both get 1,500 km of free Supercharging: https://ts.la/marco81154
There is actually not much to say here. One advantage of the Tesla is the well-developed Supercharger network, especially because the chargers basically always work and you just plug in. No collection of charging cards and x different billing systems. Plus, the car is done charging before the kids are done eating.
In the first year, Supercharging was free for up to 1,500 km worth of energy, because I used a referral link when ordering. In total, I used at most 1,000 km of that, since the Superchargers are only really relevant on long trips; otherwise, the car was always charged at home. At the holiday destination, we charged at AC charging stations, usually with Maingau, because there is only one tariff and it is also the cheapest in comparison (0.35 EUR/kWh).
The good thing about this car compared to “classic” cars is that it is not already outdated when you buy it. Instead, it keeps getting better, because there are over-the-air updates. Lots of them, in fact; on average about every 4 weeks:
Since I got the car, features like Netflix and games have been added, but also useful things like showing third-party charging stations in the navigation system, as well as many Autopilot improvements (and sometimes regressions).
The car itself is basically unchanged. However, since the trunk opening is quite limited in height and you sometimes have to properly climb in, I replaced the standard lighting with really bright lamps*. Now you can actually see what is all the way in the back.
We also bought a roof rack, though not the original one from Tesla, which costs 480 EUR for the carriers alone.
There are good and cheap alternatives, though. After some research in various forums, I chose these*. They are simply placed on top and clamped, which makes them very well suited for the glass roof.
Two bikes fit on top and the two kids’ bikes in the back, at least for now, and only if you are not carrying much other luggage. Perfect for day trips. For vacations, only the kids’ bikes go on the roof and we rent bikes locally.
In November last year, I switched from the Tesla API Scraper to Teslamate. It is very actively developed, runs entirely locally (not in the cloud), and has many good analysis options. It also has an MQTT interface, so the car can easily be integrated into home automation, e.g. via NodeRED.
For example, I switch the charging station in the carport on or off depending on whether the car is at home.
Then the time came. Exactly one year to the day after pickup, first this happened:
And then this:
Apparently, the problem was a fault in the drive unit, i.e. the motor. The service center in Hamburg communicated with me proactively and was very fast. They apparently had never seen this fault in Hamburg before... In any case, the entire drive unit had to be replaced. Luckily, the fault occurred not far from home at friends’ place, and not in the middle of the Autobahn or on vacation. Everything is fine again now, and I am confident that this will remain the only problem.
Links marked with * are Amazon affiliate links.
I have a Tesla Model 3 and it is a computer on wheels. Next-level IoT ;-)
I’m using the Tesla API Scraper to log some data from the car to a local InfluxDB. It comes with a nice Grafana dashboard like this one:
My main developer machine is a VM in Azure (see previous post). It shuts down automatically every evening to save costs, but I have to start it manually in the morning.
Now that the InfluxDB contains near real-time location data, I’ve created a simple .NET Core console app that connects to the InfluxDB, gets the latest location, and checks the distance to my work location.
If the car is near my work location, the console app posts to an Azure Automation webhook to start the VM. Awesome!
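The app itself is just a few lines. Here is a minimal sketch of the same logic in PowerShell (the original is a .NET Core console app; the InfluxDB measurement and field names, coordinates, and webhook URL are placeholders):

```powershell
# Query the latest position from a local InfluxDB 1.x (measurement/field names are hypothetical)
$query  = "SELECT last(latitude), last(longitude) FROM position"
$result = Invoke-RestMethod -Uri "http://localhost:8086/query?db=tesla&q=$([uri]::EscapeDataString($query))"
$values = $result.results[0].series[0].values[0]   # [time, latitude, longitude]

# Rough proximity check against the work location (placeholder coordinates)
$workLat = 53.55; $workLon = 9.99
$dLat = [math]::Abs($values[1] - $workLat)
$dLon = [math]::Abs($values[2] - $workLon)

# ~0.01 degrees is roughly 1 km at this latitude; good enough for a proximity check
if ($dLat -lt 0.01 -and $dLon -lt 0.01) {
    # Trigger the Azure Automation runbook that starts the VM
    Invoke-RestMethod -Method Post -Uri "https://TODO_AUTOMATION_WEBHOOK_URL"
}
```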
To track my personal expenses throughout the month, I have created a simple web form with Vue.js, Vuetify, and the Azure Storage JavaScript Client Libraries. It is hosted on an Azure static website.
On save, the entry is stored in an Azure storage queue.
From there I have created a Flow to append the data to an existing Excel sheet hosted on SharePoint Online.
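For testing, you can drop a message into the queue without the web form. A minimal sketch (the queue name and the JSON shape of the message are assumptions based on the description above):

```powershell
# Put a test expense entry into the storage queue (queue name and JSON shape are hypothetical)
az storage queue create --name expenses --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --account-key TODO_YOUR_ACCOUNT_KEY
az storage message put --queue-name expenses --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --account-key TODO_YOUR_ACCOUNT_KEY --content '{"merchant":"Test Store","total":12.34,"date":"2019-06-01"}'
```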
Microsoft just launched the private preview of receipt understanding in Form Recognizer.
So, to make it even easier to enter new receipts, I have added a capture button to my web app to take a photo of the receipt and send it to the Form Recognizer API.
The API returns a JSON response with fields for `Total`, `MerchantName`, `TransactionDate`, and a lot more. I grab them and populate the input fields.
The extension was scaffolded with the `yo code` generator. The source of my extension can be found here in my GitHub repo.
To connect to the Azure DevOps REST API, you need a personal access token, which can be obtained from https://dev.azure.com/YOUR_ORG -> Your Account -> Security.
The API documentation can be found here.
The specific API I wanted to use is missing from the documentation, which may mean it is not meant for public use (yet?). However, the DevOps portal uses the same API, so it works for now:
```
GET https://dev.azure.com/[YOUR_ORG]/_apis/distributedtask/resourceusage?parallelismTag=Private&poolIsHosted=true&includeRunningRequests=true
```
The parameters are important to see all the running tasks from the hosted agents. The result contains a field `usedCount` with the number of currently running jobs. However, to get the details of who is building what, you need a second call to the details API, whose URL comes back in the result of the first call: `data.runningRequests[0].owner._links.self.href`
To access the REST API, you have to use Basic auth with a dummy username and the PAT (personal access token) as the password, as you can see here (I’m using the request HTTP client library to call the API).
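The post uses the Node request library; as an equivalent sketch, here is the same call in PowerShell (the organization name and PAT are placeholders):

```powershell
# Basic auth: empty dummy username plus the PAT as password, base64-encoded
$pat    = "TODO_YOUR_PAT"
$header = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

$uri  = "https://dev.azure.com/TODO_YOUR_ORG/_apis/distributedtask/resourceusage?parallelismTag=Private&poolIsHosted=true&includeRunningRequests=true"
$data = Invoke-RestMethod -Uri $uri -Headers $header

# Number of currently running jobs
$data.usedCount

# Follow the owner link of the first running request to get the details
Invoke-RestMethod -Uri $data.runningRequests[0].owner._links.self.href -Headers $header
```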
First of all, you need an Azure subscription. Then, in the Azure Portal, click `Create a resource` and search for `Azure Search`.
My blog is static HTML generated by Hexo. It also generates a `content.json` file containing all posts (titles, text, publish date, tags).
This file is stored in Azure Blob Storage, where the Search Service can easily index it. Just click `Import data`.
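Getting the file into Blob Storage is a one-liner, e.g. as part of the blog’s deployment. A minimal sketch (storage account and container names are placeholders; `./public` is Hexo’s default output folder):

```powershell
# Upload the Hexo-generated content.json to the container the indexer watches
az storage blob upload --account-name TODO_YOUR_STORAGE_ACCOUNT_NAME --container-name blog --name content.json --file ./public/content.json --overwrite
```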
Here is the configuration:
You can optionally add some cognitive services to extract special entities like company names or do some text analysis. After that, you have to specify what data you want in the index:
After the indexing is complete, you can query the search service via a REST API. You can test it in the search query explorer directly from within the Azure Portal.
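Outside the portal, the same query works against the REST endpoint. A minimal sketch (service name, index name, and API version are placeholders; the query key comes from the service’s Keys blade):

```powershell
# Query the index for posts matching "azure" (names and api-version are placeholders)
$headers = @{ "api-key" = "TODO_YOUR_QUERY_KEY" }
Invoke-RestMethod -Headers $headers -Uri "https://TODO_SERVICE_NAME.search.windows.net/indexes/TODO_INDEX_NAME/docs?api-version=2019-05-06&search=azure"
```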
I’ve added some lines of JavaScript to my blog to fetch the search results, and here we go.
To make it easier to find what you are looking for, I should add a tag filter and some date sorting.