• The many levels of newbieness, written by a newb

    04/09/2021 at 13:39

    (Edit: I found a forum question that already asked this: https://softwareengineering.stackexchange.com/questions/256833/why-dont-developers-make-installation-wizards-on-linux )

    My favorite thing about the Raspberry Pi Zero is the cost. Since its release in 2015, the Pi Zero has allowed beginners and experts alike to harness computing power without needing a whole lot of other hardware.

    The purpose of the Raspberry Pi is, and always has been, education. The purposes of education and of enterprise-industrial applications are quite different, if not polar opposites. So when something isn't working as I'd like, I try to remind myself that the device exists for learning how computers work, not as a purchase that comes with a warranty or formal software support.

    To describe this "dichotomy," let's first understand that economies of scale are what led to the Raspberry Pi becoming so inexpensive. The intent of the Rpi developers is to lower the barrier to affording a computer, not to support all the most advanced features.

    I think the meaning of basic computing should be explored. What is a basic computer? One that provides internet access and has a common but modern display connector, such as HDMI. These are the basic computing features used by the vast majority of users today, as opposed to the basics of an early desktop computer from the 1980s. Back then, basic computing meant office suites, printing, and intranet access.

    This is in no way a criticism of the Rpi. In fact, I am immensely grateful for their disruptive technology. I own one Raspberry Pi 3B+ and two Raspberry Pi Zeros (the second Zero I bought just because). However, I hope to find a use for all three. I use one of the Zeros and the 3B+ frequently to test and benchmark the performance of various operating systems, particularly ones that boot from RAM, such as Puppy Linux, DietPi, piCore, and other ports of originally x86 distributions.

    The short explanation is that I am interested in extending the mission of the Rpi by cataloguing the OSes other than Raspbian with the most Raspberry Pi support, and then determining which ones can boot from RAM. Of those, the ones that can run in RAM while running a select number of applications could then be optimized, with or without a traditional suite of operating system apps, so that performance makes use of the full amount of RAM, whether it is 512MB, 1GB, or 2GB. In this way, the educational mission of the Rpi can exercise another, less often used function of the Rpi software: the initrd or initramfs, which runs the entire OS in high-speed memory. With the proliferation of carrier boards that natively support NVMe, it is certainly encouraging to see the enhanced performance of the Raspberry Pi Compute Module 4 using hardware it already has. However, if the original mission of the Raspberry Pi was to educate, then a high-speed native boot drive or PCI Express capability could be considered goal displacement. Which, again, isn't necessarily a bad thing.
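    As a concrete illustration of the initramfs idea, here is a minimal sketch of how such a RAM-resident image is typically assembled. The directory and file names are placeholders, and a real image would also need busybox and any required kernel modules; the `config.txt` line is the standard way the Pi firmware is told to load an initramfs alongside the kernel.

    ```shell
    # Stage a tiny root filesystem with an /init that becomes PID 1.
    mkdir -p initramfs-root
    cat > initramfs-root/init <<'EOF'
    #!/bin/sh
    # PID 1 inside the RAM-resident system
    mount -t proc proc /proc
    exec /bin/sh
    EOF
    chmod +x initramfs-root/init

    # Pack the staging directory into a newc-format cpio archive
    # (the layout the kernel expects for an initramfs), then compress it.
    ( cd initramfs-root && find . | cpio -o -H newc ) | gzip > initramfs.img

    # On the Pi's FAT boot partition, config.txt would then contain:
    #   initramfs initramfs.img followkernel
    ls -l initramfs.img
    ```

    The `followkernel` keyword places the image in memory directly after the kernel, so the firmware does not need an explicit load address.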

    The definition of the Raspberry Pi as an "educational tool" could be re-examined a little further by asking how many eras the tool is supposed to cover. Is it supposed to teach only modern operating systems, or obsolete/outdated ones as well? A podcast last month reflected on the software development of the 1990s with a much more critical eye; the segment from 11:30 to 16:30 discusses software efficiency, noting that the amount of RAM required today is much greater than for earlier OSes and that efficiency has been lost.

    After listening to this, I thought it would be environmentally responsible to research the Raspberry Pi's performance based on its included RAM, which is far greater than what many early operating systems required, and that this performance could rival many of the NVMe carrier boards now being developed. It is a long-established fact that the hierarchy of computer speed is L1 > L2 > L3 > DRAM > SSD > HDD.
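    To make that hierarchy concrete, the script below prints ballpark access latencies for each tier. The exact figures are assumptions, rough order-of-magnitude numbers that vary widely by hardware, but the ordering itself is what motivates running an OS from RAM instead of from an SD card or disk.

    ```shell
    # Approximate access latencies in nanoseconds (illustrative only).
    # Ordering is the point: each tier is roughly 10-1000x slower
    # than the one above it.
    print_tier() {
        # $1 = tier name, $2 = latency in ns
        printf '%-9s ~%10s ns\n' "$1" "$2"
    }

    print_tier "L1 cache" 1
    print_tier "L2 cache" 4
    print_tier "L3 cache" 30
    print_tier "DRAM"     100
    print_tier "NVMe SSD" 25000
    print_tier "HDD"      5000000
    ```

    Reading a byte from a spinning disk, by these rough numbers, costs on the order of millions of L1 accesses, which is why a RAM-resident OS on even a 512MB Pi Zero can feel responsive.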


  • What is Innovation? A review of 3 common types

    02/27/2021 at 19:53

    Innovation is a central part of hacker/maker culture. What does it mean to innovate? The reasons may be personal or entrepreneurial, but the meaning is the same. In this blog post, I will briefly examine two types of innovation and then review a third. There are certainly other types of innovation, but I will focus on these at the nexus of science and technology.

    1. The discovery of a natural property, such as electricity or magnetism, and the development of a product from it, such as the light bulb or the inductor.

    2. The modification or redevelopment of an existing product, such as the incandescent bulb, to make a more efficient light, such as a CFL or LED.

    3. The transplantation of an existing product, such as a light bulb, into another product, such as a car, to produce the headlight.

    The third type of innovation interests me the most, because there are brilliant inventors in many different fields. But it can seem unnatural for some people to be receptive to an idea like powering a headlight from an internal combustion engine, even with a battery in between. I'm sure the idea eventually caught on, considering its practicality, but how long did it take to be adopted? This is the core struggle of innovation. It not only faces a struggle in its own right, that of developing something new or modifying something for a new enhancement, but also the struggle of public adoption, where broad applicability, rather than niche features, is often the main reason a technology is promoted.

    Open-source ideology is a great concept. If one looks back into the history of open source, one finds a very strong push to establish what I believe was the first Linux operating system:

    from https://www.hipeac.net/vision/2021/ (111MB)

    smaller 6-page version here: https://cdn.hackaday.io/files/1777167603401344/192-197%20-%20Copy.pdf

    "The first point is to understand that there are fundamentally two separate families of open source licence. What we call permissive licences (Apache, MIT, BSD) basically allow your users to take what you have provided, use it, modify it and even sell it. They do not even have to tell you what they are doing with it. Most annoyingly, they can take what you have started and, when they make something better out of it, they do not have to share it with anyone else. Particularly at the beginning of the open source movement, this was seen as a major problem, and so called reciprocal licences were developed (GPL, LGPL). This second family of licence asks the user to make systems built using what they have received openly available under the very same licence."

    Why was it a "major" problem? To start, Ubuntu and the Linux kernel didn't exist then. Today, a free and easily downloadable ISO is almost taken for granted. I do not know how many developers wrote the first kernel (three at least?). Since then there have been thousands of projects and forks of projects. That is normal, because there was no need to develop anything "centralized" anymore; once the software is developed, the hardware is a matter of aesthetics.

    Yet, as more developers seek to adopt open-hardware projects, I think of some islands of development that are stratified in their capability. There could perhaps be more "mega-projects" to develop some commonality in desired hardware components, such as a mini-ITX-like motherboard in a Raspberry Pi form factor. Of course I am plugging my own laptop project now, but it's more about suggesting any mega-project with a lot of features that a generation of developers would want to use. This isn't to say there is not already a lot of effort toward something like this. What would an open-hardware product look like? Maybe it would use an open RISC core like the Berkeley Out-of-Order Machine https://boom-core.org/  If someone wanted to develop an open-source...


  • There is an uncanny valley between microcontrollers and microprocessors

    02/27/2021 at 04:11

    Why not just use a microprocessor?

    Why not use a microcontroller?

    These are the questions I anticipate. I think there is a mutually exclusive comfort zone for each preference, and a strong revulsion in between. Let's build a microcontroller that operates like a microprocessor. Or a microprocessor that resembles a microcontroller?