Easily one of the most irritating things about WordPress is the continuous attempts to up-sell in the dashboard. Here’s how to get rid of them.
It’s one thing to have a gentle hint, like Updraft Plus saying “hey, you can get more features with the pro version” with a dismiss function that makes the reminder go away for a year. It’s more irritating to have something that pops up on every login, and even more irritating to see something show up every time you are on the dashboard.
I can tolerate most of these. Sometimes they’re irritating enough that I just switch plugins, especially when they try pushing related plugins you don’t give a damn about. This is one reason why there’s no Yoast on my sites. The SEO Framework is just as capable, a lot less pedantic, and it doesn’t nag. Yet.
But what to do when there’s no alternative and the plugin is persistently nagging? Use your ad blocker to make it go away! Since it has just recently pissed me off, I’m going to use WPCode Lite, an otherwise useful plugin as the prime example. With a recent update, this plugin has elected to add a widget to the post edit page with the title “WPCode Page Scripts”. Cool! That’s a welcome minor convenience. Click on the widget though and what do you get?
Page Scripts is a Pro Feature
That’s nice, but for me it’s not worth subscribing. How do I dismiss this? Surprise, surprise, I can’t. Well, F*ck you, WPCode! I’ll do it myself. Now this takes a little knowledge of HTML and custom AdBlock Plus rules, but the general process goes like this:
Right click on the widget and choose Inspect. This will highlight the general area of the code you want.
Look for the highest level element that uniquely identifies your target widget. In my case, inspect took me to a div with the id “advanced-sortables”. We don’t want that. There are a bunch of useful elements on the page that are enclosed by that. Drill down a bit and we find another div with the id “wpcode-metabox-snippets”. That’s the one!
Click on your AdBlock icon.
Click on the gear to get to settings.
Select the advanced tab from the left.
Scroll down to “Your Custom Filters”.
In the box below “My Filter List”, enter this rule, substituting your domain for mine:
ambitonline.com###wpcode-metabox-snippets
You can also leave off the domain if you have multiple sites, but in my experience more specific rules help prevent long periods of “WTF” until you realize that the ad blocker is doing something you didn’t expect.
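For reference, here are both forms of the filter side by side. The element id is the one found via Inspect; yours will differ by plugin, and the domain is of course mine:

```
! Hide the widget on one site only:
ambitonline.com###wpcode-metabox-snippets
! Hide it on every site you visit (broader, and easier to forget about):
###wpcode-metabox-snippets
```

Lines starting with `!` are comments in Adblock Plus filter syntax, and `###` means “hide the element with this id”.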
I’m picking on WPCode here because the arrogance of this has really ticked me off, but you can apply the same method to most other notices like this. At some point, plugin developers will take defensive measures to make this even more difficult, and then we’ll have a discussion about writing user scripts in Tampermonkey and its kin.
Anyone writing code has probably seen a bunch of references to Docker by now. It’s like this new toy that’s all the rage, but for people like me — where picking up new things takes a crap-load more work than it used to — the general reaction is “I’ll learn that when it’s unavoidable.” Alas, a few weeks back it became unavoidable, and I’m here to report back.
If you’re even mildly inquisitive, a quick scan of Docker says a lot about containers, and it’s pretty obvious that a container is some kind of virtual environment, but there’s not much that tells you why you should give a damn. I’m going to try to fix that. I’m not going to describe how to use Docker here, just what it’s for. There are lots of great resources on making use of it elsewhere.
If you got into development in a time when your workplace had a nice air-conditioned room with raised floors and racks of servers and keyboard/console switches, then this post is for you.
The TL;DR on this is that the learning curve is a lot less steep than it seems at first, and the flexibility it gives you to set up and change configurations is truly powerful. Invest a few days in learning the basics and you can build, rebuild, and reconfigure virtual server rooms that significantly reduce the amount of time needed to maintain your own local environment. As a common example, if you’ve ever sworn at LAMP/XAMPP/MAMP configurations only to start from scratch, or if you’ve tried to get two versions of just about anything running on the same system, then Docker definitely is for you.
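To make the “two versions of anything” point concrete, here’s a hypothetical docker-compose file, a sketch only: the service names, image tags, ports, and the `./src` path are all illustrative, not from any real project. It serves the same code base under two PHP versions at once, something that’s a nightmare on a bare LAMP stack:

```yaml
# Hypothetical sketch: one code base, two PHP versions, side by side.
services:
  php74:
    image: php:7.4-apache       # legacy runtime on http://localhost:8074
    ports:
      - "8074:80"
    volumes:
      - ./src:/var/www/html
  php82:
    image: php:8.2-apache       # current runtime on http://localhost:8082
    ports:
      - "8082:80"
    volumes:
      - ./src:/var/www/html
```

One `docker compose up`, and both “servers” exist; `docker compose down`, and they’re gone without a trace on your host system.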
I’ve got this old, slightly torn and patched 100 dollar bill up on a desk shelf where I can see it. A few months back we found it between some papers and I didn’t have the slightest recollection of how I got it.
But now it’s coming back to me. I think a friend and business partner gave it to me to acknowledge the work I was putting into our project. I hope that externally I was suitably grateful, appreciative, and said “that’s not necessary”, all of which would have been true.
The project, and the acknowledgement, came at a time when my extended period of “mild to moderate” depression was well into “moderate”, but likely before I truly realized what was going on. I might have put all I had into that project, but what I had wasn’t much. My effort yielded a bunch of prototypes and some ambitious code that never saw completion. I might have been doing my best at the time, but I was all too aware that my best was a fraction of what I had been capable of in the past.
My internal dialogue compounded my funk. Was I now just too old to write good code? Was the passion I’d had since I was a kid just done with me? If so what next? I certainly didn’t think my efforts were worth much, certainly not $100. After launch and the first sales, now that’s worth a pat on the back and a nice dinner! Thrashing at a solution with nothing of production quality… not so much. No matter how sincere the effort, effort without results is difficult to distinguish from no effort at all.
No, this was an undeserved reward. Another testament to my failure to perform, something else to highlight the pervasive feeling that I had: I was afloat on a large body of still water in a deep fog, with a pair of good oars but no idea of where I was, where I was going, how far it might be, or in which direction I might proceed to find anything. There I sat, adrift. I might row from time to time, but it was never clear if the effort was pointless or not, if my limited transit was changing anything or just Brownian exploration. I am sure I stuffed that $100 someplace where I knew the chances I’d run across it again were slim, where it couldn’t remind me how adrift I was.
That was almost 20 years ago now. I had no way of knowing that I’d be in that ugly fog for more than 15 years, although with thankfully few bouts of moderate during its course.
Now I sit here, three or so years clear of the battle. [There is no way for me to say “there, that’s when I beat depression!” I just gain increasing confidence that it’s not coming back anytime soon.] I look at this old bill, not even knowing if it’s still a negotiable instrument, wondering if I should try to deposit it, frame it as a reminder that sooner or later if I simply persist it is possible to be free, or just slip it between a few papers and see if I rediscover it on a much later purge.
Whatever I do, there is no way to describe how I feel knowing that I am looking at this ugly beast in the proverbial rear view mirror.
There’s a mouthful. There are two parts to this post. The first part is the story of how I arrived at using Behat, a tool that facilitates Behaviour-Driven Development (BDD). The last part is the TL;DR configuration I used to get it working. If you just want to get that configuration going and don’t care about the background, skip down to it. I won’t be hurt.
First an admission: I’m really, really late to the BDD camp, and it kind of pisses me off. If I’d been using this approach for the past 15 years, there’s no question I would have gotten more done in less time.
Is a Test-Driven Development approach effective, even when working on a medium-scale personal project? Yes, and here’s why.
I’ve long been an advocate of Unit Testing in software development, having found that even though the work is often tedious, the investment in good tests significantly reduces overall development time. In my experience, the time invested in unit testing pays back by a factor of ten or more.
For my latest as-yet-unreleased project, I decided to try a Test-Driven Development (TDD) approach. In short, TDD requires that you write the tests for your code before you write the code itself. This has some significant benefits from a design/architecture perspective:
It forces the developer to think more about the external interface for a package and less about the implementation details.
It requires a more precise definition of what the inputs and outputs of a functional block are. This can expose flaws in the overall architecture. In the case of my project, I was working on the principle that the module would take one input file and generate an output. While creating the third test case, I realized that I was duplicating content from one input file to the next, and that a far better approach would be to break that information out into a second, common file that could be used multiple times.
It assures near-100% test coverage. In theory it guarantees 100% code coverage; in practice that’s more difficult. More on that later.
Writing a test for everything forces you to become a user of your own code. This serves to highlight problems early, before they become deeply embedded in the code. When designing something, it’s easy to oversimplify. The only easy part of the “that part will be easy” trap is falling into it. In this project my “easy” trap required no fewer than three refactorings and a partial rewrite before it was actually easy for the user. If I had been coding first and testing later, backing out of the initial design would have been much more difficult and would have required throwing out a significant amount of code. Code is time.
It reduces useless code. I have a distaste for missing functionality. This means that when I write a method to do something, I’m inclined to generate similar methods that do parallel things at the same time. TDD puts an end to that.
It highlights common functionality that may not have been evident in the requirements phase. This makes it easier to spin out that functionality into independent classes early in the implementation.
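To make the test-first rhythm behind all of the above concrete, here’s a minimal, self-contained sketch. The slugify() function and its cases are invented for this illustration, not taken from my project: the assertions get written first, then just enough implementation to make them pass.

```php
<?php
declare(strict_types=1);

// Step 2: just enough implementation to satisfy the assertions below.
// (In TDD this function is written AFTER the tests that follow it.)
function slugify(string $title): string
{
    $slug = strtolower(trim($title));
    // Collapse every run of non-alphanumeric characters into one hyphen.
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);
    return trim($slug, '-');
}

// Step 1 (written first): the tests define the external behaviour,
// forcing decisions about edge cases before any implementation exists.
assert(slugify('Hello, World!') === 'hello-world');
assert(slugify('TDD in 2019') === 'tdd-in-2019');
assert(slugify('  --weird  input--  ') === 'weird-input');
echo "all tests pass\n";
```

Writing the third assertion is exactly the moment the messy-input question gets answered, before a line of slugify() exists, which is the point of the list above.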
Software Engineering Purity and Test Structure
Although I’m a strong advocate of automated testing, it turns out I’m far from a purist. Even though there are some clear benefits, I’m not likely to build a suite of mock classes just to isolate a subject class. 100% coverage is always a good thing, but I’ll take 80% coverage, knowing that some abstract class I don’t have an explicit test for got a pretty thorough run through by other tests, metrics be damned.
If tests require some expensive operation to complete, such as accessing a database, then creating a data provider is worth the effort, but doing so purely for the sake of correctness? Not so much.
It’s a little too easy to write code for something obvious, like checking for a null value, without writing a corresponding test. Sticking to Test-Driven Development requires a level of discipline that’s difficult to maintain, particularly when you’re the only developer on a project. Because of this my test suites tend to be a mix of unit tests, integration tests, and some kind of ugly hybrid of the two, and I’ve decided that I don’t really care. Generating sample output in a bastardized test that ends with assertTrue(true) is still useful. Even though that test always passes, every once in a while the “real world example” test exercises an unexpected pathway and throws an error that would otherwise sneak by. I’ll take that find over purism ten times out of ten.
TDD and Over-engineering
I’m also more relaxed about software engineering principles when it comes to testing. I’m more likely to copy and paste test code than I am to carefully craft a hierarchy of test classes. I may be more relaxed but I’m not lax… as it turns out this project has a bunch of test cases that are common across multiple classes. I initially just copied the first class, with all the cases, which are fairly elaborate. Then I needed to copy them again for a third class. Three is the magic number when you realize you just screwed up. It took a fair bit of effort to go back and decouple the test case generation from the expected results, but there is an immediate payback: tests that exercise more sophisticated features in the latter classes are now automatically passed to their predecessors. If a simpler class can’t handle the new case in a reasonable way, it’s evident immediately.
While I have clearly strayed from a pure Test-Driven Development methodology, starting out with TDD gave my project an obvious lift.
Reducing gold plating and improving the design
On more than one occasion I found that I was inclined to embellish code with things like getters and setters that looked useful but actually had no use case. All the “I should add this” moments were converted to “I need to write a test for this first”, and it’s not long before you realize that you don’t need a test because nothing will ever need to use the method in this way. Better yet, it makes you think about how that functionality would ever be used. The end result of this was twofold.
First, entirely removing functionality that seemed like a good idea early in the design process but in reality was just useless baggage. Second, if the functionality was useful, it was usually in the wrong place. This led to a series of code refactors that extracted that functionality into a conceptually clean solution, either a trait or a stand-alone class, in either case useful in many places. Less code or better code. Both excellent benefits of starting with TDD.
TDD Offers Measurable Progress
When working on a project of significant complexity, particularly when working alone, it’s easy to lose track of where you are. That’s not a problem if you have a client deadline looming in the not too distant future, but when it’s a personal project, and particularly if you’re blessed with a healthy dose of ADD, it’s easy to lose momentum.
For me, loss of momentum is the death of a project. I’ve got a long list of unfinished projects that I thought would take a few weeks when I started, but in fact they needed many months. Nearly all of them died from a momentum deficit.
Test-Driven Development, with its focus on top-level functionality, really helps with that. Even though my current project is perhaps 50% complete, it’s generating useful results. The implementation is partial, but it’s functional, and needless to say it’s tested, well structured, and robust. Instead of substantially complete code that turns up significant problems when put to actual use, I have working but functionally incomplete code that I expect will be a joy to keep working on. All of this gives me enthusiasm for implementing the next level of features.
Conclusions
Even though I have strayed from the TDD methodology as the project progressed, starting with TDD was the best thing I’ve ever done when working on something of significant size.
If I were working in a team, not only would I start with TDD, I’d be far more strict about sticking to it throughout the development cycle. It highlights architectural issues very early in the development process, when it’s far easier to adjust and fix the problem. A dozen minor restructurings while the code base is small is a thousand times easier than rewriting thousands of lines of code after the mistakes have been baked into the project.
It’s hardly an objective measure, but I think this code represents some of the best work I’ve done. My three week project has extended to five months so far, but I’m still excited about it. Best of all it’s still fun. It also has the unexpected benefit of spinning off Configurable, which has proven itself to be very useful in other projects (not to mention that it’s just cool, IMO).
What I particularly like is the days when I’m standing in front of all the bits required to make coffee and there’s this internal dialogue:
L: Okay, make coffee now.
R: Wut?
L: MAKE COFFEE!!
R: How?
L: Oh FFS, all the bits are there, just DO IT!!
R: Something about this round thing, right?
L: Aughh!! WAKE THE F*CK UP!
R: I need coffee.
L: Yes, YES! You need to MAKE it, you do this every f*cking day!
R: Ugh, ok, right. Push grind button.
L: There we go! F*ck me.
R: Water.
L: Yes, yes, you’ve got this!
As we recover from yet another mass killing we hear a lot of smart people saying that the rise of hate is driven by a fear of loss of power. But they don’t clearly identify where that fear comes from.
At the same time, even though we might not be aware of it most of us already know where it comes from on a visceral level. We worry about our debt load, about what kind of world our children and grandchildren will grow up in, about the environment, about our jobs, our pensions, and ironically about the rise of hate.
Humans are a competitive species. Our political systems are a gossamer barrier between modern civilization and tribalism. It’s far too easy to transfer our anxieties into an identification of some “other” that we can blame for our worries. It is a dangerously small step from resentment to hatred, and from there to violence.
What few seem to notice, or at least to explicitly identify, is the correlation between the confidence of the middle class and the strength of social liberalism. If most people think the future will be bright and there will be more than enough prosperity to go around, suddenly we’re a lot less concerned about differences in race, religion, gender, and so on.
It is a cruel irony that this social conservatism has an affinity for autocratic politicians, precisely the type that are going to ensure that the wealth gap increases. The real way to alleviate social anxiety is to vote for a party that will actually work to redistribute wealth. Instead of gravitating towards populists, the anxious middle class should be gravitating to socialism.
If I were a conspiracy theorist, I’d assert that the Illuminati (or whoever) are manipulating society to achieve this result, but I just can’t see that. I think it’s just our base human nature. We all have a responsibility to fight these instincts, for nothing good can come from the alternative.
UPDATED 2019-07-22: Netbeans 11.1, released today, incorporates a robust fix for this problem that should survive future changes in PHPUnit.
TL;DR: There’s a one-line patch to PHPUnit below that will kludge the kludge and get you running again.
Anyone who has worked with PHPUnit for some time knows that backwards compatibility isn’t exactly a prime consideration. Meanwhile, although Netbeans currently has very good support for PHP, you have to figure that the intersection set between the Java developers working on the project and the people who figure that PHP is anything but a toy language for building simple projects is, well… small. [We’ll just ignore the fact that Facebook, the largest application on the planet, is written in PHP].
So when Netbeans says it offers support for PHPUnit 3.4.0 or higher, it’s okay to expect the integration to be out of date. Rather surprisingly, it’s actually worked right up to PHPUnit 8.1.
But now we have 8.2. Command line parsing has been made more robust, and the extra parameter Netbeans passes in to a kludged custom test when running a single file doesn’t work anymore. [BTW I don’t blame the Netbeans developers for this kludge, it was probably the only solution that worked back in 3.4.0.]
This makes the current Netbeans approach outdated and unworkable. Like many open source projects, this means someone who cares has to go in and do some significant re-work. Don’t hold your breath. I’d give it a go but my Java foo is about 20 years old now. I probably don’t know enough any more to even be dangerous. I think I know a good architectural approach but attempting to implement it would be a recipe for failure.
So what to do? Patch PHPUnit! This is a pain since the patch will have to be reapplied every time PHPUnit is updated, which is frequently. But at least it works.
So here’s the kludge: in the file TextUI/Command.php, in the handleArguments() method, just change the line that reads
if (isset($this->options[1][1])) {
to
if (isset($this->options[1][1]) && substr($this->options[1][1], 0, 6) != '--run=') {
This ignores the Netbeans-generated argument and everything works as before. Not pretty but it works.
For more information (and a possible fix), follow the Netbeans Issue.
The video below has been around for a few years. It’s a great clip. The issue I have with it is that I think it supports people who already understand the scientific process, but it fails to fully explain the distinction between the casual and scientific uses of “theory”.
There is a critical distinction between a theory and a Scientific Theory. For example, I have a theory that a significant proportion of people react to headlines without digging into the content behind that headline.
Theory vs Hypothesis
Colloquially, that’s a theory. Scientifically, it’s a Hypothesis. If I do a study to test this Hypothesis and I get data that confirms it, then it’s a Valid Hypothesis. It’s still not a Theory. [Note that if I indeed were to do such a study and publish it, it’s likely to be reported as “Study Proves People Only Read Headlines”. That headline is bullshit. Which is why just reading headlines is dangerous.]
Now if a bunch of other people also do studies and get the same result, then it’s on its way to becoming a Theory. That’s where Evolutionary Theory is now. There are hundreds of thousands of experiments that not only prove that evolution is a fact; there are millions more that depend on it being a fact.
Science is Just Our Best Guess at Fact
Now it is true that sometimes science gets it wrong. Especially in life sciences. The attack on dietary fat that we’ve seen for the past 40 years or so is a prime example, but that’s starting to be corrected with new research. This is how the scientific method works. Science is still done by people, and people can get things wrong. Frequently science self-corrects fairly quickly, but sometimes it takes quite some time. When it’s wrong the contradictions eventually get exposed, and those contradictions lead to more focused research. Eventually the right answer emerges, even if it means contradicting previous conclusions. Evolutionary Theory has been around for over 150 years, and nobody has managed to invalidate it yet… and I’m sure many have tried. That’s the point the video makes well: there’s an overwhelming body of work that proves evolution is real. This proof is well beyond reasonable doubt.
So when you hear the word “theory”, ask yourself which meaning the person is using. Is it “I have a theory that someone steals one of each pair of my socks” theory, or is it “Gravitational Theory says that gravitational attraction is inversely proportional to the square of the distance between two massive bodies” theory? It’s a pretty important distinction.
Readers of this blog will know that I’m a huge fan of open source. Be it software, designs, engineering, etc. There’s a huge body of work that I believe benefits from the open source movement. That belief is predicated around freedom. I strongly believe that people who use products should have the ability to control their destiny after they acquire a product, and the best way to do that is to give them the tools to recreate and modify the product.
But that doesn’t mean everything should be free. While it’s true that I have contributed to collaborative open source projects that give the code away, it is not an entirely altruistic endeavour, for at the same time I’ve taken advantage of similar efforts by hundreds of others to build things that I never could have built alone. The key thing here is that these projects are collaborative works where all the participants in a community benefit.
Individual creative works are another thing entirely. There’s no similar multiplier that gives a creator back a multiple of what they’ve contributed. Someone who illegally downloads a book by a small author isn’t gaining any freedom, they’re just getting a product for free. If you download one of my Creative Commons licensed low resolution photographs and use it to print a crappy large format print, you deserve to waste your money. Book authors don’t enjoy that ability to constrain the clarity of digital versions of their work.
At some point, as individual labour stops being the way most of us add value to the economy, we’ll have to transition to some form of guaranteed minimum income scheme. At that point, there might be a rationalization that goes along the line of “this author is already receiving enough to get by, so they’re getting enough”. While I can’t say I agree with that position, at least someone who is passionate about creating has the knowledge that they won’t starve to death in the process. But neither will they live comfortably, nor will they receive the value that others derive from reading their work. A survivable system still isn’t a sustainable system. But this is futurism. We’re not there yet, and until then, taking advantage of someone who needs the money to keep doing what they’re doing is outright theft.