I’ve had a Raspberry Pi 4B sitting in my cabinet for a few months now. When I dusted it off, I realized the SD card was busted. I got a replacement 64GB U3 A2 card and got the Pi up and running with Ubuntu Server. The primary intended use was to run Docker with the database containers I use for my side projects, such as Postgres, MySQL, and MongoDB.
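To make that concrete, here is a minimal sketch of the kind of container I mean, using the Docker SDK for Python; the image tag, container name, password, and volume are placeholder values, not my actual setup:

```python
# Minimal sketch: start a Postgres container via the Docker SDK for
# Python (pip install docker). Names and credentials are placeholders.
import docker

client = docker.from_env()

postgres = client.containers.run(
    "postgres:16",                      # official image; arm64 builds exist
    name="sideproject-postgres",
    detach=True,
    environment={"POSTGRES_PASSWORD": "change-me"},
    ports={"5432/tcp": 5432},           # container port -> host port
    volumes={"pgdata": {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
    restart_policy={"Name": "unless-stopped"},  # survive Pi reboots
)
print(postgres.name, postgres.status)
```

The same pattern works for the MySQL and MongoDB containers; only the image, port, and environment variables change.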
My current home network consists of several routers for better WiFi coverage.
Design systems give your budding project a jump start and, more importantly, a structure as it continues to grow. They bring a level of sophistication and uniformity to your thinking, and their value lies beyond pre-made CSS/JS assets. Identifying the right design system in the initial phases is crucial for progress. I use various metrics to pick one, such as:
- Community support & acceptance
- Documentation
- a11y/i18n/l10n
- Component library
- Commit rate
- Backers

However when it is time to play aka.
Introduction

Moving to or starting with AWS (or any cloud provider) comes with an implicit assumption that your business will pay for what it uses. Although technically true, this framing ignores the human aspect. More often than not, developers make assumptions while allocating resources and end up with:
- Overprovisioned resources
- Unused resources

In this article, based on our experiences and AWS events/whitepapers¹, we will outline a few approaches to combat the ramifications of these decisions at any stage of product growth.
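As one concrete illustration of the “unused resources” bucket (a sketch assuming boto3 with credentials already configured; the region is a placeholder): EBS volumes that are no longer attached to any instance keep accruing charges, and they are easy to enumerate:

```python
# Sketch: list EBS volumes that are provisioned but attached to nothing,
# one common form of paying for unused resources. Assumes boto3 and
# AWS credentials in the environment; the region is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Volumes with status "available" are not attached to any instance.
resp = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)

for vol in resp["Volumes"]:
    print(f'{vol["VolumeId"]}: {vol["Size"]} GiB, created {vol["CreateTime"]}')
```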
As a development team matures and moves through various stages of code organization and system design, the deployment layout adapts with it. The change can be a function of product growth, team size, technology decisions, or a combination of these.
The general progression is from a monolith to a handful of homogeneous microservices. As the product diversifies and the team grows, these microservices become heterogeneous: they can use different languages, servers, API endpoints, and so on.
Mac! I absolutely love macOS for development. It gives me the power of Unix with a whole lot of convenience. Can it be replaced with Unix? YES! Do I want to? No. As a full-stack developer and CTO of my company, I spend 80% of my day in three apps:
- Shell/Terminal
- IDE
- Browser

These three are the same no matter which OS I use. I see macOS simply as a shell for my Unix environment.
In a 2009 talk, Tony Hoare traced the invention of the null pointer to his design of the Algol W language and called it a “mistake”:
I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler.
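For contrast (my sketch, not Hoare’s example): modern type checkers provide exactly the automatic checking he wanted. With an Optional type and a static checker such as mypy, a possibly-null reference cannot be dereferenced until the None case is handled:

```python
# Sketch of compiler-checked reference safety: with type hints and a
# static checker such as mypy, Optional values must be narrowed before use.
from typing import Optional


def find_user(user_id: int) -> Optional[str]:
    """Return a username, or None when the id is unknown."""
    users = {1: "alice", 2: "bob"}  # placeholder data
    return users.get(user_id)


name = find_user(3)
# print(name.upper())    # mypy: Item "None" of "Optional[str]" has no attribute "upper"
if name is not None:     # the checker forces handling of the None case
    print(name.upper())  # safe: name is narrowed to str here
```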