Quite, I love the aesthetic of the watch stand. The exposed lacquered wire makes it look really exotic, and in general the execution is top-notch. It is really tempting to throw stuff like this together and call it a day once it works (that's more my style...), so I have a lot of respect for people who can finish these things to the point where they not only work but also look good.
I currently use Banktivity, which is OK. Would love to hear from others who have used Banktivity and migrated to something else. Ideally, there should be OFX support.
FYI you should have used llama.cpp to do the benchmarks. It performs almost 20x faster than ollama for the gpt-oss-120b model. Here are some sample results on my Spark:
Is this the full-weight model or the quantized version? The GGUFs distributed on Hugging Face labeled as MXFP4 quantization have some layers quantized to int8 (q8_0) instead of bf16, as suggested by OpenAI.
For example, looking at blk.0.attn_k.weight, it's q8_0, as are several other layers:
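For anyone who wants to check this on their own download, here's a minimal sketch of how one might dump per-tensor quantization types. It assumes the `gguf` Python package (from llama.cpp's gguf-py) is installed, and the filename is a hypothetical local path:

```python
# Sketch: list the quantization type of each tensor in a GGUF file.
# Assumes the `gguf` package (llama.cpp's gguf-py) is available;
# the filename in the usage comment below is hypothetical.

def quant_summary(pairs):
    """Count (tensor_name, quant_type_name) pairs per quantization type."""
    counts = {}
    for _name, type_name in pairs:
        counts[type_name] = counts.get(type_name, 0) + 1
    return counts

def dump_gguf_quants(path):
    """Print the quant type of each attn_k tensor, return an overall summary."""
    from gguf import GGUFReader  # imported here so the sketch loads without it
    reader = GGUFReader(path)
    pairs = [(t.name, t.tensor_type.name) for t in reader.tensors]
    for name, type_name in pairs:
        if "attn_k" in name:
            print(name, type_name)  # e.g. blk.0.attn_k.weight Q8_0
    return quant_summary(pairs)

# Usage (hypothetical local filename):
# dump_gguf_quants("gpt-oss-120b-mxfp4.gguf")
```

If the summary shows a mix of MXFP4 and Q8_0 types, that would confirm the point above about some layers not being bf16.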
One of the quite expensive paid plans, as the free one has to carry "Created with Datawrapper" attribution at the bottom. I would guess they've vibe-coded their way to a premium version without paying, as the alternative is well beyond most individuals' budgets (>$500/month).
Inspecting the page, I can see some classes like "dw-chart", so I looked it up and got to this: https://www.datawrapper.de/charts. It looks a bit different on the page, but I think that's it.
I saw a TikTok of someone saying that farmers are not stupid (given the wide range of skills needed to farm successfully) and were just betting on Trump not actually going through with tariffs.
It's hard to have any sympathy for such cynical behavior from people who are simultaneously asking for handouts, especially since those same people probably voted against others getting social services.
It also hurts when I drop the iPad mini on my face. In fact, I was considering getting a Pro Max to replace both an iPhone Pro and an iPad mini, but figured it might be too big of a compromise.
I wonder if anyone has successfully gone down this path.