Let me tell you about the most wasteful US federal government spending I know about. It's a humdinger. You and everyone you know are mired in it for weeks, or perhaps months, every year. It will cost you, personally, thousands of dollars over your lifetime. I'm talking about filing your taxes.
Not paying your taxes. Paying your taxes is fine. It keeps the country running, though not because the government needs our "tax dollars" to pay for things. The government annihilates the money it taxes away from us, and creates new money to pay for programs. The USA needs US citizens' dollars to build highways the same way Starbucks needs its Starbucks gift cards to make lattes – that is, not at all:
https://theglobepost.com/2019/03/28/stephanie-kelton-mmt/
I'm talking about filing your taxes. In nearly every case, a tax return contains a bunch of things the IRS already knows: how much interest your bank paid you, how much your employer paid you, how many kids you have, etc etc. Nearly everyone who pays a tax-prep place or website to file their tax return is just sending data to the IRS that the IRS already has. This is insanely wasteful.
In most other "advanced" countries (and in plenty of poorer countries, too), the tax authority fills in your tax return for you and mails it to you at tax-time. If it looks good to you, you just sign the bottom and send it back. If there are mistakes, you can correct them. You can also just drop it in the shredder and hire an accountant to do your taxes for you, if, for example, you run a small business, or are self-employed, or have other complex tax needs. A tiny minority of tax filers fall into that bucket, and they keep the tax-prep industry in other countries alive, albeit in a much smaller form than in the USA.
In the US, we have a tax-prep duopoly: two gigantic outfits, H&R Block and Intuit, owners of Turbotax. These companies make billions from low-income, working Americans every year, charging them to format a bunch of information the IRS already has and then send it to the IRS on their behalf. These companies lobbied like crazy for the right to tax you when you pay your taxes.
In 2003, it looked like the IRS would start sending Americans pre-completed returns, so H&R Block and Turbotax went into lobbying overdrive, whipping up a "public private partnership" called the "Free File Alliance," that promised to do free tax prep for most Americans. But once the threat of IRS free filing was killed, they turned Free File into a sick joke. Americans who tried to use Free File were fraudulently channeled into filing products that cost money – sometimes hundreds of dollars – to use, a fact that was only revealed after the taxpayer had spent hours keying in their information. Free File sites were also used to peddle unrelated financial products to tax filers, with deceptive language that implied that buying these services was needed to file your return:
The big winner from the Free File scam was Intuit, which bought Turbotax in 1993. They made about one billion dollars per year ripping off Americans they'd promised to file free tax returns for. After outstanding work by ProPublica, lawmakers and the IRS were finally pressured to create an IRS-based free filing service that would cut Intuit out of the loop. Intuit went on a lobbying blitz without parallel, giving out $3.5m in bribes in 2022 in a bid to kill the Treasury Department's study of a free filing service:
https://pluralistic.net/2023/02/20/turbotaxed/#counter-intuit
In 2022, nearly every US state attorney general settled their lawsuits against Intuit for the Turbotax ripoff, bringing in $141m:
https://www.agturbotaxsettlement.com/Home/portalid/0
In 2023, the FTC won a case against Intuit over the scam:
But Intuit was undeterred. They came back in 2023 with a campaign to say that ripping off American tax-filers was antiracist and anyone who wanted the IRS to make filing free was, therefore, a racist:
https://pluralistic.net/2023/09/27/predatory-inclusion/#equal-opportunity-scammers
Strangely, no one bought that one. By May 2023, the IRS had announced its own, in-house free file program:
https://pluralistic.net/2023/05/17/free-as-in-freefile/#tell-me-something-i-dont-know
Now, no one is forcing you to use this program. Do you have a family accountant that your grandparents started using in the Eisenhower administration? Just keep going to them. Do you like using Turbotax? Keep using it! Wanna do your own taxes? Here's the forms:
https://www.irs.gov/pub/irs-pdf/f1040s.pdf
But if you want to file your taxes for free, and you earn $84,000/year or less, here's the IRS's service:
https://www.irs.gov/filing/irs-free-file-do-your-taxes-for-free
Better use it quick, though. Elon Musk has just announced that he's killing it. Yeah, I know, no one elected him. That doesn't seem to matter to anyone, least of all Democrats on the Hill, who are still showing up for work every day and trying to engender a "spirit of comity" rather than screaming and throwing eggs:
https://apnews.com/article/irs-direct-file-musk-18f-6a4dc35a92f9f29c310721af53f58b16
Musk called IRS free file a "far left" program and announced that he had "deleted it." By the way, the median Trump voter's income is about $72k, meaning more than half of Trump voters qualified for free file:
https://fivethirtyeight.com/features/the-mythology-of-trumps-working-class-support/
(Image: Wcamp9, CC BY 4.0, modified)
CenturyLink nightmares: Users keep asking Ars for help with multi-month outages https://arstechnica.com/tech-policy/2025/02/centurylink-nightmares-users-keep-asking-ars-for-help-with-multi-month-outages/
Price of eggs rising faster than bitcoin in Trump’s America https://protos.com/price-of-eggs-rising-faster-than-bitcoin-in-trumps-america/
#20yrsago US government: Your fingerprints have expired! https://web.archive.org/web/20050213034201/https://fictioneer.blogspot.com/2005/02/government-to-immigrant-your.html
#15yrsago International Amateur Scanning League will rescue our video treasures! https://web.archive.org/web/20100213170425/http://radar.oreilly.com/2010/02/international-amateur-scanning.html
#15yrsago Interview with a Nigerian 419 scammer https://web.archive.org/web/20100130161655/http://www.scam-detectives.co.uk/blog/2010/01/22/interview-with-a-scammer-part-one/
#10yrsago Nathan Barley: old comedy turned out to be a documentary about our future https://www.theguardian.com/tv-and-radio/2015/feb/10/nathan-barley-charlie-brooker-east-london-comedy
#10yrsago Court has to diagram a law’s tortured sentence structure in order to rule https://www.loweringthebar.net/2015/02/tenth-circuit-forced-to-diagram-congressional-sentence.html
Picks and Shovels with Ken Liu (Boston), Feb 14
https://brooklinebooksmith.com/event/2025-02-14/cory-doctorow-ken-liu-picks-and-shovels
Picks and Shovels with Yanis Varoufakis (Jacobin/virtual), Feb 15
https://www.youtube.com/watch?v=xkIDep7Z4LM
Picks and Shovels with Charlie Jane Anders (Menlo Park), Feb 17
https://www.keplers.org/upcoming-events-internal/cory-doctorow
Picks and Shovels with Wil Wheaton (Los Angeles), Feb 18
https://www.dieselbookstore.com/event/Cory-Doctorow-Wil-Wheaton-Author-signing
Picks and Shovels with Dan Savage (Seattle), Feb 19
https://www.eventbrite.com/e/cory-doctorow-with-dan-savage-picks-and-shovels-a-martin-hench-novel-tickets-1106741957989
Picks and Shovels at Another Story (Toronto), Feb 23
https://www.eventbrite.ca/e/picks-shovels-cory-doctorow-tickets-1219803217259
Ursula Franklin Lecture (Toronto), Feb 24
https://www.eventbrite.ca/e/2025-ursula-franklin-lecture-cory-doctorow-tickets-1218373831929
Picks and Shovels with John Hodgman (NYC), Feb 26
https://www.eventbrite.com/e/cory-doctorow-john-hodgman-picks-and-shovels-tickets-1131132841779
Picks and Shovels (Penn State), Feb 27
https://www.bellisario.psu.edu/assets/uploads/CoryDoctorow-Poster.pdf
Picks and Shovels at the Doylestown Bookshop (Doylestown, PA), Mar 1
https://www.eventbrite.com/e/cory-doctorow-picks-and-shovels-a-martin-hench-novel-tickets-1146230880419
Picks and Shovels at Red Emma's (Baltimore), Mar 2
https://redemmas.org/events/cory-doctorow-presents-picks-and-shovels/
Picks and Shovels with Matt Stoller (DC), Mar 4
https://www.loyaltybookstores.com/picksnshovels
Picks and Shovels with Lee Vinsel (Richmond, VA), Mar 5
https://fountainbookstore.com/events/1795820250305
With Great Power Came No Responsibility: How Enshittification Conquered the 21st Century and How We Can Overthrow It (Indiana University/virtual), Mar 7
https://events.iu.edu/mediaiub/event/1783095-with-great-power-came-no-responsibility-how-enshitti
Picks and Shovels at First Light Books (Austin), Mar 10
https://thethirdplace.is/event/cory-doctorow-picks-shovels-1
Picks and Shovels at Dark Delicacies (Burbank), Mar 13
https://www.darkdel.com/store/p3257/Thu%2C_Mar_13th_6_pm%3A_Pick_%26_Shovel%3A_A_Martin_Hench_Novel_HB.html#/
Cloudfest (Europa Park), Mar 17-20
https://cloudfest.link/
Picks and Shovels at Imagine! Belfast (Remote), Mar 24
https://www.eventbrite.co.uk/e/cory-doctorow-in-conversation-with-alan-meban-tickets-1106421399189
Picks and Shovels with Peter Sagal (Chicago), Apr 2
https://exileinbookville.com/events/44853
ABA Techshow (Chicago), Apr 3
https://www.techshow.com/
Picks and Shovels at Morgenstern (Bloomington), Apr 4
https://morgensternbooks.com/event/2025-04-04/author-event-cory-doctorow
Teardown 2025 (PDX), Jun 20-22
https://www.crowdsupply.com/teardown/portland-2025
DeepSouthCon63 (New Orleans), Oct 10-12, 2025
http://www.contraflowscifi.org/
The threat of big tech oligarchy and why the internet sucks (David Moscrop)
https://www.youtube.com/watch?v=s0Jfhn5wJ-o
Elon Musk's Digital Coup and the Future of the Internet (System Crash)
https://www.youtube.com/watch?v=zEZPa-YzaUs
"The Lost Cause:" a solarpunk novel of hope in the climate emergency, Tor Books (US), Head of Zeus (UK), November 2023 (http://lost-cause.org). Signed, personalized copies at Dark Delicacies (https://www.darkdel.com/store/p3007/Pre-Order_Signed_Copies%3A_The_Lost_Cause_HB.html#/)
"The Internet Con": A nonfiction book about interoperability and Big Tech (Verso) September 2023 (http://seizethemeansofcomputation.org). Signed copies at Book Soup (https://www.booksoup.com/book/9781804291245).
"Red Team Blues": "A grabby, compulsive thriller that will leave you knowing more about how the world works than you did before." Tor Books http://redteamblues.com. Signed copies at Dark Delicacies (US): and Forbidden Planet (UK): https://forbiddenplanet.com/385004-red-team-blues-signed-edition-hardcover/.
"Chokepoint Capitalism: How to Beat Big Tech, Tame Big Content, and Get Artists Paid, with Rebecca Giblin", on how to unrig the markets for creative labor, Beacon Press/Scribe 2022 https://chokepointcapitalism.com
"Attack Surface": The third Little Brother novel, a standalone technothriller for adults. The Washington Post called it "a political cyberthriller, vigorous, bold and savvy about the limits of revolution and resistance." Order signed, personalized copies from Dark Delicacies https://www.darkdel.com/store/p1840/Available_Now%3A_Attack_Surface.html
"How to Destroy Surveillance Capitalism": an anti-monopoly pamphlet analyzing the true harms of surveillance capitalism and proposing a solution. https://onezero.medium.com/how-to-destroy-surveillance-capitalism-8135e6744d59?sk=f6cd10e54e20a07d4c6d0f3ac011af6b) (signed copies: https://www.darkdel.com/store/p2024/Available_Now%3A__How_to_Destroy_Surveillance_Capitalism.html)
"Little Brother/Homeland": A reissue omnibus edition with a new introduction by Edward Snowden: https://us.macmillan.com/books/9781250774583; personalized/signed copies here: https://www.darkdel.com/store/p1750/July%3A__Little_Brother_%26_Homeland.html
"Poesy the Monster Slayer" a picture book about monsters, bedtime, gender, and kicking ass. Order here: https://us.macmillan.com/books/9781626723627. Get a personalized, signed copy here: https://www.darkdel.com/store/p2682/Corey_Doctorow%3A_Poesy_the_Monster_Slayer_HB.html#/.
Unauthorized Bread: a middle-grades graphic novel adapted from my novella about refugees, toasters and DRM, FirstSecond, 2026
Enshittification, Why Everything Suddenly Got Worse and What to Do About It (the graphic novel), Firstsecond, 2026
The Memex Method, Farrar, Straus, Giroux, 2026
Today's top sources:
Currently writing:
A Little Brother short story about DIY insulin PLANNING
Picks and Shovels, a Martin Hench noir thriller about the heroic era of the PC. FORTHCOMING TOR BOOKS FEB 2025
Latest podcast: MLMs are the mirror-world version of community organizing https://craphound.com/overclocked/2025/02/09/mlms-are-the-mirror-world-version-of-community-organizing/
This work – excluding any serialized fiction – is licensed under a Creative Commons Attribution 4.0 license. That means you can use it any way you like, including commercially, provided that you attribute it to me, Cory Doctorow, and include a link to pluralistic.net.
https://creativecommons.org/licenses/by/4.0/
Quotations and images are not included in this license; they are included either under a limitation or exception to copyright, or on the basis of a separate license. Please exercise caution.
Blog (no ads, tracking, or data-collection):
Newsletter (no ads, tracking, or data-collection):
https://pluralistic.net/plura-list
Mastodon (no ads, tracking, or data-collection):
Medium (no ads, paywalled):
Twitter (mass-scale, unrestricted, third-party surveillance and advertising):
Tumblr (mass-scale, unrestricted, third-party surveillance and advertising):
https://mostlysignssomeportents.tumblr.com/tagged/pluralistic
"When life gives you SARS, you make sarsaparilla" -Joey "Accordion Guy" DeVilla
In a game focused on exploration, we need to have a way to decide what the player can and cannot see. With a first-person or third-person 3D view, visibility comes for free: anything below the horizon, and anything behind something else, is not visible. With a top-down perspective, we need to do some more work.
Here’s the test scene I’ll be using:
Please ignore the black areas in the corners, which are due to a bug in my camera calculations that I’ll fix later. The player’s ship is in the centre, but it’s so small it’s barely visible. I’ll have to look into that later too.
The simplest way to restrict visibility is to limit the camera zoom, so you simply cannot zoom out farther to see more. Since the view is always centered on the player’s ship, this effectively limits how far they can see. However, because the screen is typically not square, this restricts visibility more in the vertical direction than the horizontal; the amount of restriction would also depend on the screen aspect ratio. Instead, we want some kind of circular visibility.
Of course, this is not a new problem, and games have been solving it at least since the early real-time strategy games like Dune II:
The areas not visible to the player are simply rendered in black here. More precisely, in RTS games, areas unexplored by the player are rendered in black, but that’s not what I want for my game. I’ll just make everything black that isn’t currently visible.
Thanks to my newly gained experience with Godot’s compositor effects, it’s relatively simple to implement this:
However, this restricts visibility to a circle at the horizon. Wouldn’t it be fun if, just like in a first-person view, things farther away than the horizon would also be visible, if they were tall enough, such as mountains and volcanic islands? So let’s add that! Using some vector algebra and trigonometry, it’s not too hard to compute whether a given point is above or below the horizon:
And… that’s cool, but there is a problem. The calculations are based on a typical carrack of the time, with a lookout standing in the crow’s nest 25 meters above the water level. The equation to compute the distance to the horizon is easy to derive using Pythagoras’s theorem:
d = sqrt(2*R*h + h²)
where R is the radius of the planet and h is the height above sea level. (If h is small relative to R, we can omit the h² term.) With our planet being 1% the size of Earth, this puts the horizon at 1.8 km away. That is, we can see the surface of the ocean up to 1.8 km away, but may be able to see taller things beyond that.
Let’s say that a mountain can be at most 10 km tall. (The tallest mountain on Earth, Mount Everest, stands at 8.8 km.) The tip of such a mountain would then be visible from almost 39 km away! That means it would be very far outside our little horizon circle; you’d have to zoom out until the horizon circle occupies only a fraction of the screen, before you can spot that mountain in the distance. And the game would have to generate and render terrain at huge distances to make this work.
Fortunately, when scaling down the planet to 1% the size of Earth, I also scaled terrain heights down to 10% size. (This should of course also have been 1%, but that would reduce mountains to mere hills in comparison to the ship.) So actually, our mountain is at most 1000 meters tall. This makes it visible from about 13 km away, which is already better, but still a lot compared to our 1.8 km horizon radius.
So it’s time to cheat, and scale down terrain height only in the visibility calculations. But by what factor? Let’s take it from the other side: from how far away do we want a 1 km tall mountain to be visible? We can make this value configurable directly for the game developer (me), and have the game code figure out the right scale factor. If we set the distance to a reasonable 5 km, that works out to a maximum mountain height of 81 meters, i.e. a scale factor of 0.081 on top of the 10% we have already:
That seems a bit too tame, but maybe these mountains aren’t actually very tall; I haven’t checked. At least now I have a meaningful value to adjust. (And in the remainder of this post, you’ll see that I did adjust it.)
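Inverting the horizon formula for height is straightforward; here's a Python sketch of that calculation (illustrative only, with the same assumed planet radius as before):

```python
import math

PLANET_RADIUS_M = 6_371_000 * 0.01   # 1% Earth scale (radius assumed)
LOOKOUT_HEIGHT_M = 25                # lookout in the crow's nest

def horizon_distance(h, R=PLANET_RADIUS_M):
    return math.sqrt(2 * R * h + h * h)

def max_visible_height(target_distance_m, R=PLANET_RADIUS_M):
    """Tallest terrain whose tip is just visible at target_distance_m.

    Whatever distance remains beyond the lookout's own horizon must be
    covered by the mountain's horizon; invert d^2 = 2*R*h + h^2 for h.
    """
    d_mountain = target_distance_m - horizon_distance(LOOKOUT_HEIGHT_M)
    return math.sqrt(R * R + d_mountain * d_mountain) - R

h_max = max_visible_height(5_000)
print(h_max)           # ~81 m
print(h_max / 1_000)   # ~0.081: the scale factor for a 1 km mountain
```

Setting the target distance becomes the single knob to tune; the scale factor falls out of it.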
Even though you can’t go far wrong with black, it looks rather dull, especially if a large portion of the screen is filled with it. In reality, you would see the sky above the horizon. So how about using the sky colour instead of black?
As it happens, during my excursion down the third-person rabbit hole in the last post, I wrote some shader code to compute sky colours and atmospheric scattering. So I can reuse that here – it hasn’t been in vain after all!
First, let’s compute the sky colour just above the horizon, and use that instead of black. This is a good opportunity to show off the day-night cycle as well, which has been in the game forever:
I think this looks fairly good, but I also added some debanding noise after capturing this video.
(You’ll notice that it’s not completely dark before sunrise and after sunset. That’s because there is also a moon, which is currently always full, and whose position is not overridden by the debug controls.)
When looking at a distant object, such as a mountain, you’ll notice that it appears blueish and washed out. This phenomenon is called, somewhat confusingly, aerial perspective. The washing out happens because the light from the mountain is partially absorbed and scattered before it reaches your eye. The blueish tint is caused by light from the sun being scattered into the line of sight; light towards the blue end of the spectrum is more prone to scattering.
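The usual single-scattering model behind this effect fits in a few lines. This Python sketch is purely illustrative (a scalar extinction coefficient rather than the per-wavelength one real scattering has, and the colors and beta value are made up):

```python
import math

def aerial_perspective(obj_color, sky_color, distance, beta):
    """Fade an object's color toward the in-scattered sky color.

    t = exp(-beta * distance) is the transmittance: it attenuates the
    object's own light (extinction), while a (1 - t) share of sky light
    is scattered into the view ray (in-scattering).
    """
    t = math.exp(-beta * distance)
    return tuple(o * t + s * (1 - t) for o, s in zip(obj_color, sky_color))

GREEN_MOUNTAIN = (0.2, 0.5, 0.2)
PALE_SKY = (0.6, 0.7, 0.9)

near = aerial_perspective(GREEN_MOUNTAIN, PALE_SKY, 1_000, 1e-4)
far = aerial_perspective(GREEN_MOUNTAIN, PALE_SKY, 20_000, 1e-4)
# The far mountain comes out bluer and washed out, close to the sky color.
```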
Godot has fake aerial perspective built in, but the problem is that it works from the point of view of the camera, where we want it to work from the point of view of the ship. There’s no way to override that, so we have to build our own. Fortunately, with the sky shader, all the pieces are already there, so it’s just a few lines of code:
It adds a nice sense of distance. Because of the 1% planet scale, I had to strengthen the effect by a factor of 30 to make it look like this. Realistically that should have been a factor of 100, but that turns out to be too strong.
Godot offers shadow rendering by default, but I had to tweak the values quite a bit to make them work well at these scales. They make a big difference:
To wrap things up, here’s a video of exploring a bay in the early morning:
I do see some problems already, such as that the aerial perspective effect is also applied at full strength in shadows, where inscattering should be less. I’m not yet sure how to fix that, so I’ll leave it at this for now. The game still looks better now than it did this morning!
Work on the game has been slow due to lack of time, but worse, I’ve let it drift off in the wrong direction. I need to be more careful about scope creep. But first, let’s talk about the progress that I am happy about.
In a previous post, I described my newly written entity-component system (ECS). I had some reasonable arguments for structuring the game as an ECS, most notably that Godot doesn’t let you add functionality to all nodes regardless of type; you’d need multiple inheritance for that.
I did mention the workaround of putting the functionality onto a child node, but that’s a fairly heavyweight solution. However, does that matter in practice? Perhaps not. Moreover, I’ve found another approach: put the nodes into a group, and have a separate node that applies some operation to all nodes in that group. This only works for code, not for data, but it turns out I don’t need that – I initially wanted to store a 64-bit node position, but storing a 32-bit position relative to the floating origin works just as well.
And my ECS came with a number of drawbacks. Adding new node-based entities now required me to add code in five different files:
This was just way too cumbersome, for little benefit. With Godot’s node-based approach, I’d only need the scene and a script, and the link between the two is automatically managed.
And because all the values were tucked away in components and systems in pure C# code, I lost the ability to modify scene properties in the editor while the game is running. I didn’t realize at the time how useful that feature is.
Another argument for using an ECS was that it would simplify the implementation of saving and (especially) loading. I now realize that isn’t true. Yes, streaming a bunch of components from disk using some predefined serialization format is easier than reconstructing a scene tree from (e.g.) a JSON object. However, after loading those components, I’d still need to reconstruct that scene tree anyway. The work is just moved into the systems that update nodes from components, but it still needs to be implemented.
So in the end, I decided to throw out the ECS and port everything back to a more customary Godot scene tree. Everything? No, there were a few key benefits that I wanted to keep.
First, there’s dependency injection. Systems allowed injecting resources and queries into their constructors, and this really helped to decouple the code. Since we’re now doing nodes again, I wrote a very simple Injector node, which is triggered whenever a node is added to the scene tree. It uses reflection to scan the new node for any fields with the [Inject] annotation, and provides them with values from an array of pre-registered injectable objects. Functionally it’s almost the same as Godot’s singletons, but makes the dependency more explicit, which I like.
Second, there’s the global event bus. I haven’t actually implemented this yet, and maybe I won’t need to, but I’m definitely keeping it in mind.
Earlier this year, Rune Skovbo Johansen alias runevision released LayerProcGen, a principled framework for procedural generation of infinite worlds. My worlds aren’t infinite, but they are big enough that we can’t generate and store them in their entirety, so the same principles apply. The idea is that procedural generation happens in layers, and each layer is generated in chunks as usual. Each layer can only depend on the layers below it, but it can request chunks from a larger area so that it has some context to work with:
Source: the LayerProcGen documentation, licensed under MPL 2.0
I realized that I was essentially already doing some of the things LayerProcGen helps with, but in a more ad-hoc way. So I decided it would make sense to switch over to this framework. I couldn’t use Rune’s code directly even though it’s in C#, because it assumes a flat world (curse those spheres!). Fortunately, the implementation isn’t rocket science so I just wrote my own.
I now have layer-by-layer, chunk-by-chunk generation working, distributing the work over several threads to speed it up. The game looks exactly the same as before, but the code is better organized and easier to build on top of.
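The layer/chunk idea can be sketched roughly like this (Python for illustration – my implementation is in C#, and the example layer names and their contents are made up):

```python
class Layer:
    """A procedural-generation layer that may depend on the layer below.

    Each chunk of this layer can request chunks from the lower layer over
    a larger (padded) area, so it has surrounding context to work with.
    """
    def __init__(self, below=None, padding=1):
        self.below = below
        self.padding = padding
        self._cache = {}

    def chunk(self, cx, cy):
        if (cx, cy) not in self._cache:
            context = []
            if self.below is not None:
                p = self.padding
                context = [self.below.chunk(cx + dx, cy + dy)
                           for dy in range(-p, p + 1)
                           for dx in range(-p, p + 1)]
            self._cache[(cx, cy)] = self.generate(cx, cy, context)
        return self._cache[(cx, cy)]

    def generate(self, cx, cy, context):
        raise NotImplementedError

class HeightLayer(Layer):
    def generate(self, cx, cy, context):
        return f"height({cx},{cy})"   # stand-in for actual noise generation

class RiverLayer(Layer):
    def generate(self, cx, cy, context):
        # Rivers need the surrounding heights to trace downhill paths,
        # so each river chunk pulls a 3x3 block of height chunks.
        return (f"rivers({cx},{cy})", len(context))

rivers = RiverLayer(below=HeightLayer(), padding=1)
print(rivers.chunk(0, 0))   # ('rivers(0,0)', 9)
```

The cache means each chunk is generated once no matter how many higher-layer chunks request it; in the real thing, that generation is what gets distributed over threads.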
Around this point, I got sidetracked a bit.
I’d previously settled on 3D rendering, but with a mostly top-down camera:
On a whim, with my newly found powers of editing the scene tree while the game is running, I moved the camera away from the top-down perspective, and put it in a third-person perspective behind the ship. And it looked… rather nice. Oh dear.
This screenshot doesn’t even have any land in it, but it already has a much more immersive feel than the top-down view. It would also add interesting gameplay elements, such as distant coasts actually being less clearly visible, and having to do more work to match your surroundings to a map chart.
I figured that we have a full 3D scene already, so it shouldn’t be too much work to use this perspective instead of top-down, right? So down the rabbit hole I went, not realizing how deep it was.
We can now see the sky, and it looks rather drab and boring – not even properly blue. Indeed, that’s the best you can get with Godot out of the box, so I had to write my own sky shader. I did that, implementing a pretty standard path tracer with single scattering, which made sunsets about 100× prettier:
It automatically works with moonlight too:
Okay, we now have some atmospheric scattering going on, but it’s only applied to the sky and not to any other objects:
The faraway islands are still a harsh green, rather than fading into the distance. Fortunately, Godot has a checkbox to add some fake aerial perspective, which mimics the effect of light scattering into the ray between the camera and the distant mountains. This instantly made the scene look much better:
The effect is fake and might be limiting later once I start adding fog, but it’s better than nothing. Good enough for now.
With this new third-person perspective, the low-poly ship model contrasted weirdly with the highly detailed waves and terrain. I already had a solution in mind for that. There weren’t any cameras in the Age of Sail, so much of what we know has come to us in the form of paintings. So wouldn’t it be cool if the game looked like an oil painting as well? I’d been planning to apply a post-processing filter to do just that.
I looked around in the literature, and found that there are essentially two ways to make images look like paintings. On the one hand, there are filters that modify the image in some clever way, so that the result looks somewhat like brush strokes. On the other hand, some techniques create and render actual brush strokes. The challenge with that is temporal coherence: we render 60 frames per second, but we don’t want entirely new brush strokes to appear every frame, because that would cause way too much flicker.
The anisotropic Kuwahara filter by Kyprianidis et al. is of the first category, which is easier to implement, so I wanted to try it first:
This YouTube video shows an implementation in TouchDesigner, applied to some drone videos of a mountainous landscape, and it looks absolutely gorgeous (skip ahead to 1 minute):
I got most of the way through implementing this, using Godot’s new compositor effects and compute shaders, when I realized that this rabbit hole was too deep. The somewhat naive, but still GPU-based implementation was taking almost 100 milliseconds per frame; I’d need to get it down to 3-4 ms to still run smoothly on older hardware. And some bug was causing it to look more like a bad JPEG than a painting.
And even if I fixed the bugs and somehow made it 30 times faster, this filter might still not achieve the look I’m after, because the input doesn’t have nearly as much detail as a photograph or video. This filter is essentially about removing detail, whereas I might be better off with something that adds detail instead, i.e. individual brush strokes, with all the temporal stability issues that that entails.
Alice felt that she was dozing off, when suddenly, thump! thump! down she came upon a heap of sticks and dry leaves, and the fall was over.
At this point, I belatedly realized that going full 3D wasn’t as quick and easy as I’d originally estimated. The sky shader needed more work to get rid of the green horizon. The painterly rendering was the kind of stuff that academics build entire careers on, but without it, I’d need to craft more detailed models. And on top of all that, I only have clear skies so far – I haven’t even begun to implement real-time, dynamically changing clouds yet.
It was becoming clear that I would have to cut scope, and put the camera back where I had planned it. I even considered going to 2D entirely, but that would essentially mean starting from scratch and wasting even more time. Instead, let’s climb back out of this hole to the surface, take a breath of fresh air, and forge ahead with what I’ve got already.