• Will Fastie: How to speak machine


    • #2451135

    ISSUE 19.23 • 2022-06-06 • PROFILES • By Chris Husted
    This newsletter’s editor in chief is proficient in more than 16 programming languages. At many of th…
    [See the full post at: Will Fastie: How to speak machine]

    • #2451232

      All of us know that Photoshop is the gold standard for graphics software.

      For a number of years, I have used Paint Shop Pro, because it is quite affordable and provides essentially all the features of Photoshop that I need.

      Paint Shop Pro has one disadvantage that irks me: it is the slowest-loading software I have EVER used.  It typically takes from 40 seconds to a minute and a half to load when I click on its icon.  I am using a fast Windows 10 machine with plenty of RAM.

      Thinking that the problem might be on my end, I contacted Corel customer support, and their “third-tier” technical support person said that Paint Shop Pro was programmed in Python.

      I have programmed in C and its variations, as well as Python.  Is Python inherently slower than C, for example?
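      For concreteness, here is a minimal sketch (pure Python; an illustration, not anything from Corel's code) of the gap in question. CPython interprets bytecode at run time, so a tight loop generally runs one to two orders of magnitude slower than the equivalent loop compiled from C. Load time, though, is usually dominated by importing modules and loading resources rather than by raw loop speed.

      import time

      def count_up(n: int) -> int:
          # A tight pure-Python loop; the interpreter dispatches each
          # bytecode, where compiled C would run a few machine instructions.
          total = 0
          for i in range(n):
              total += i
          return total

      start = time.perf_counter()
      count_up(10_000_000)
      print(f"pure Python loop: {time.perf_counter() - start:.2f} s")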

       

      • #2451269

        It typically takes from 40 seconds to a minute and a half to load when I click on its icon.

        I also use Paint Shop Pro, and mine opens almost instantly when I click the icon, so it appears something on your PC is slowing it down. Maybe your antivirus program is set to scan every program before it starts?

        System specs:

          Windows 10 Pro 21H2
          ROG Maximus XI Gene Motherboard
          Intel i7-9700K CPU (clocked at ~4.9 GHz)
          NVIDIA GeForce GTX 1060 (6GB Founders Edition)
          32GB memory (G.SKILL Ripjaws V DDR4-4266, 2 x 16GB)
          Samsung 860 EVO SSD.

        • #2451286

          That is comforting to hear.  I will investigate further.  Thank you for the quick response.

          Ken

    • #2451338

      Thanks for the memories. Right after college (1975), I worked in operations research, and we bought an IBM 5100; it was probably the first computer portable enough to carry from one office to another, weighing around 50 pounds. It sat on the desktop and ran either interpreted BASIC or APL, selected by a rocker switch. It used cassette tapes for storage and had a 5-inch monochrome display and 64KB of RAM. It also attached to an IBM 5103 dot-matrix printer.
      We used it to create mathematical models, which often had to be left running overnight or over the weekend. It was a great alternative to using the IBM 360 mainframe. Deep inside, that 5100 was actually running IBM 360 microcode.
      That was six years before the IBM 5150, and well before the Internet (though ARPANET was running).

    • #2451350

      I started programming almost fifty-five years ago, working with a Digital PDP-8, programming it in assembly by using a special typewriter to punch code and data as rows of holes in paper tape.

      Then Fortran came along, and it was a revolution for those of us in engineering: a higher-order language with statements humans could understand. So were the keypunched Hollerith-code cards, which were much easier to work with than spools of paper tape had been.

      Then, in my own case, came UNIX and C, followed by the rest of the C family. Algol and COBOL were never things I needed, so I did not use or learn them.

      Then, decades later, I started to work for NASA and, lo and behold, because there was so much legacy Fortran software to cannibalize into new software, and many of the technical people were Fortran people, I started working more and more with successive versions of Fortran, and here I am today.

      Interestingly enough, modern, vectorized Fortran is still in use for engineering and scientific number-crunching using supercomputers, for weather prediction models and climate models, and in places like CERN, to analyze the cascades of super-collider data. Not exclusively, of course, but it has not died out, as many may think it has.

      Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

      MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
      Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
      macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

    • #2451359

      IBM 5100

      The IBM reps were at my door trying to sell me Series/1 and then 5100s. But they didn’t try too hard, as we were competing with them in the point-of-sale market. The last thing they wanted was our system running on their minis. Of course, that was the last thing we wanted, too.

    • #2451364

      Digital PDP-8

      At General Instrument, we used the direct descendant of the PDP-8, the Data General Nova 1200. DEC was always sniffing around trying to get us to buy PDP-11s, but they couldn’t match the performance.

      • #2452638

        And then one day came DEC’s VAX line, and later Alpha!  It was a wonderful programming environment, and properly coded programs were a lot faster than bad programmers realized.

    • #2451366

      I took a Fortran course in college and thought it easy to learn.  I found the teacher attractive, so I decided to learn IBM BAL (Assembler) over the summer on my own to impress her when I took her BAL class in the fall (and she was impressed).  I wound up switching my major from mechanical engineering to CS.  And also had a bit of a fling with the teacher [lol].

      Unlike a lot of people, I found BAL easy to master and, like Will, thought that understanding how everything works at the system level builds a strong foundation.  I wrote a basic OS for my senior project and, after graduation, went to work as a system programmer, where I was able to muck around with actual IBM OS code (they released the source code back then).  I also wrote a decent disassembler, among other projects.

      However, the money wasn’t there, even after a couple of job changes, so I moved to the vendor world doing sales support and then, in the mid-’90s, transitioned to technology sales.

      Looking back, especially when the sales weren’t rolling in, there were many times when I wished I had stayed in system programming, which was brought home by a few recent articles I read about how much compensation has escalated in our web-driven world.  I understand that these days it is not uncommon to make $200-300K (plus benefits) at companies like Google or Facebook.  Now consider this article:

      I quit my job as a Netflix engineer making $450,000 a year. The money wasn’t worth the boredom.
      Jun 1, 2022, 6:51 AM

      https://www.businessinsider.com/quit-job-at-netflix-senior-engineer-boredom-six-figures-2022-5

      This is a far cry from the $11K annually I was paid in my first full-time job back in 1979!  Or the $62K base (plus $20K bonus) I was making as an SE for a major mainframe vendor in 1992.  Whew…

      • #2452641

        BAL was fun, and those of us taking the BAL classes (with the guy who did the programming for the State College system in PA) really enjoyed it, knowing that we were much better programmers than those who didn’t get to know and understand it.  Our prof did push us, but he also taught us how to use BAL with COBOL to do things that COBOL couldn’t really do back then, in the mid-1970s.  He used that to create a real-time student registration system, around 1974.

        And what you said about pay: right on the button.  I got my first IT job in fall 1979, officially as a trainee at Manny Hanny in NYC, though I could have taught the class.  The pay was awful, but getting one’s first job in IT in banking in NYC was important.

    • #2451392

      understanding how everything works at the system level builds a strong foundation

      I certainly stand by what I related to Chris about that, but if you happen to be a 75-year-old COBOL programmer, especially one who knows how to patch in machine code, you can make $500K per year maintaining 60-year-old code.

      There’s a lot of COBOL still out there.

      • #2451402

        I learned COBOL back in the 1960s, and then Fortran, C, etc.

        When many people were afraid of the millennium bug, I spent a lot of time telling people and companies that they did not need to worry.  Almost none of them had any real problem.

        In the 1970s, I found myself on a train in Washington, DC.  On one side of the car sat Grace Hopper, and on the other side of the car, next to me, sat a Georgetown University student holding a COBOL book.  I said to him, “That is Grace Hopper sitting there.  If you ask, I am sure she would autograph your COBOL book.”  He laughed and said, “No, I am sure she is dead by now.”  I said, “Look at the name tag on her uniform.”  He did, and she graciously signed his COBOL book.

        My meager COBOL skills notwithstanding, I always considered COBOL to be a “stupid” language.  I am still not convinced that it was necessary, although many businesses used it between 1960 and 2000.

        Ken

        • #2452642

          I wish I’d met Amazing Grace, AKA Rear Admiral Hopper.  It’s important to understand that COBOL was a game changer, allowing a good programmer to write good procedural code for many purposes, especially business processing.  With a good OS behind it, like VAX/VMS, it became amazingly powerful, yet potentially ‘self-documenting’ (when done properly).

          My brother programmed in APL and Fortran, which were far more appropriate to his work crunching numbers.  My programming was about business systems: crunching data to understand the business and to get work done.  For instance, taking IBM mainframe data and converting it into Lotus 1-2-3 worksheets (all this code in COBOL with VMS services), which were then downloaded to early PCs for analysis.  Previous attempts had failed.  My COBOL program described the data and needed maybe 30 lines of code to do the actual work.  Work that my IBM-based counterparts couldn’t do, and which they insisted I could not do on my early VAX.

      • #2451461

        Woulda, shoulda, coulda…

        I took a COBOL course in college for a semester.  Aced it but was totally bored by it.  I was going to be a SYSTEM PROGRAMMER working on the OS, not simple application-level programs! 😉  The hubris of youth….

        Are people still coding BAL?  I could probably resurrect that skill relatively easily.

        • #2452709

          ibe98765: “Are people still coding BAL?”

          Yes, for special applications where speed, economy of computer resources, etc., are necessary.  For example, some of this is military or space-related, where the size, power, etc. of the electronics are limited.  Assembler is also used for creating, modifying, or fixing compilers, which translate human-level languages into machine instructions.  There may be other uses as well.  But, fundamentally, knowing how to program in assembler is very useful, because it gives one a feel for programming based on an intimate knowledge of how a computer works, and knowing this makes it easier, and more likely, to program well.

          Ex-Windows user (Win. 98, XP, 7); since mid-2017 using also macOS. Presently on Monterey 12.15 & sometimes running also Linux (Mint).

          MacBook Pro circa mid-2015, 15" display, with 16GB 1600 MHz DDR3 RAM, 1 TB SSD, a Haswell architecture Intel CPU with 4 Cores and 8 Threads model i7-4870HQ @ 2.50GHz.
          Intel Iris Pro GPU with Built-in Bus, VRAM 1.5 GB, Display 2880 x 1800 Retina, 24-Bit color.
          macOS Monterey; browsers: Waterfox "Current", Vivaldi and (now and then) Chrome; security apps. Intego AV

    • #2451418

      I always considered COBOL to be a “stupid” language.

      COBOL was created in 1959. Among other things, that means there were limited resources. The compiler could only be so big, and it turns out that COBOL is simple to compile compared to other languages, because it’s extremely structured. It also wasn’t all that hard to learn; when I was originally looking for a job, the Social Security Administration accepted me and would have taught me COBOL from scratch. Given the era, it’s easy to see why it was widely adopted in business.

      I am still not convinced that it was necessary

      It was available.

    • #2451455

      COBOL was created in 1959. Among other things, that means there were limited resources.

      COBOL was great. I developed hundreds of banking programs in COBOL on room-sized computers. COBOL is still in use.

    • #2451506

      Philosophy that works for everything:

      • If you make it look easy, everyone thinks they can do it.
      • If you make it look fun, everyone wants to do it.
      On permanent hiatus {with backup and coffee}
      offline▸ Win10Pro 2004.19041.572 x64 i3-3220 RAM8GB HDD Firefox83.0b3 WindowsDefender
      offline▸ Acer TravelMate P215-52 RAM8GB Win11Pro 22H2.22621.1265 x64 i5-10210U SSD Firefox106.0 MicrosoftDefender
      online▸ Win11Pro 22H2.22621.1992 x64 i5-9400 RAM16GB HDD Firefox116.0b3 MicrosoftDefender
      • #2452643

        Other programmers didn’t think a lot of what I did was possible.  I sure wasn’t going to tell most of them how I did things, though I did write one article in a trade journal on how to do something in VMS in a certain language that DEC had not documented for that language.  A third-party vendor had long advertised a utility you could embed in a program, but it was very expensive.  My article came out, and their ads disappeared!  My only regret: I didn’t make any money from that!  But it was fun, and a satisfying challenge, though my bosses never knew how much I enjoyed doing it.

    • #2452636

      I too enjoyed the article on Will Fastie.  I programmed for about 20 years, from the late 1970s to the late 1990s.  Along the way, I came to understand that one could never underestimate the ability of users to misuse or misunderstand a system, even one written just for them.  From that came the realization that GUI ease of use was just as important as a system’s functionality.  Of course, the only way to effectively make that point to management was to build a system that senior people who hated computers would be willing to use.  And once one has done that for real, and gotten the hoped-for results, it’s easy to resist writing systems that are hard to use.

      Now, I coded mostly in a pre-web era and environment, and only in a few languages, including VAX COBOL, but I learned BAL (Assembler) early on, and with it got a good understanding of what my ‘high level’ code actually did.  Reading dumps from my early COBOL code may have been somewhat painful, but it was also enlightening.  Today’s new programmers need some of that training before they are let loose on an unsuspecting public!
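      A rough modern analogue of that below-the-source view, sketched in Python rather than BAL: the standard library’s dis module shows the interpreter-level instructions behind a single line of high-level code.

      import dis

      def add_tax(price, rate):
          # One source line; dis shows the steps the interpreter executes.
          return price * (1.0 + rate)

      # Prints the LOAD/BINARY/RETURN instruction sequence
      # (exact opcode names vary by Python version).
      dis.dis(add_tax)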

    • #2452678

      Work that my IBM-based counterparts couldn’t do, and which they insisted I could not do on my early VAX.

      I’ve always been amazed by being told what I couldn’t do, especially after I’d done it.

      • #2452727

        I spent a fair amount of time tracking down documentation on Lotus.  One day their engineers sent me an internal manual to the innards of a 123 .WKS file, including its binary format.  Later on, they publicly released the same for their .WK1 files, where the main change I recall was reduced storage space: some things were stored in variable-length fields instead of fixed-length ones, which of course changed the internal format.

        The data came to me by magtape from the mainframes, originally blocked at one 80-byte record per block.  I told the IBM engineers to change that to 32K records per block.  They responded, “Your VAX can’t read that.”  Hmm.  The coding change took about a minute; it was just a matter of changing the blocking factor.  (I loved how fast the tape went through the drive, spinning almost continuously.)

        The data was in EBCDIC.  VMS had a one-line system call: OUT = function(IN), essentially.  The crazy part was the binary conversion, because VMS and DOS had the nibbles reversed, plus the format was a bit complex, especially for the .WK1 files.
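        For anyone retracing this today, both conversions are short in modern Python.  A minimal sketch, assuming EBCDIC code page 037 (“cp037” in Python’s codec registry; other shops used other code pages).  The “reversed” storage is byte order: the System/360 family is big-endian, while the Lotus file format, like the PC, is little-endian.

        import struct

        ebcdic_record = b"\xc8\x85\x93\x93\x96"   # "Hello" in EBCDIC cp037
        text = ebcdic_record.decode("cp037")      # EBCDIC -> readable text
        print(text)                               # -> Hello

        # Lotus stored multi-byte integers little-endian; struct makes
        # the byte order explicit when decoding the raw file bytes.
        raw = b"\x34\x12"                         # bytes as stored on disk
        (value,) = struct.unpack("<H", raw)       # "<" = little-endian
        print(hex(value))                         # -> 0x1234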

        Then I used SMARTTERM to download the spreadsheet to the PC, opened Lotus, opened the file, and worked on it.  That project was a lot of fun, though it did take patience.  I never would have tried it had it not been for my BAL training and the understanding of how data looks underneath it all.

        In other words, even though I wasn’t using BAL for programming, the training helped me better understand how compilers worked and what my coding was doing.  For instance, buffer overflow is a frequent problem security-wise.  It was easy to see in BAL, because BAL didn’t care what you moved from point A to point B: X bytes from Address A to Address B.  You could write a BAL program that changed its own code as it ran.  Nothing was protected except by how the programmer coded it.
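        A minimal sketch of that “X bytes from Address A to Address B” behavior, using Python’s ctypes as a stand-in for BAL’s unchecked move (the two adjacent fields here are made up for illustration):

        import ctypes

        # Two adjacent 4-byte "fields" in one buffer.
        buf = ctypes.create_string_buffer(b"AAAA" + b"BBBB")

        # An unchecked move: 8 bytes into what was meant as a 4-byte field.
        ctypes.memmove(buf, b"XXXXXXXX", 8)

        print(buf.raw)   # -> b'XXXXXXXX\x00' (the second field is gone)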

        We discovered (along with coworkers and programmers we encountered) that DEC’s DCL, the Digital Command Language, could demonstrate this easily.  Like BAL, it didn’t have a lot of constraints.  You could code a command line that did very different things than it looked like it could, by assembling bits and pieces of code from multiple sources, including what you typed to answer prompts.  Years later, I learned that many computer vulnerabilities came about because programmers didn’t understand these openings and protect against them, and neither did OSes or compilers.  I’m guessing that few people at MS, for instance, are trained the way we were.
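        That same class of opening survives today as command injection.  A small Python sketch (the filename and input are hypothetical) of the difference between an assembled command string and properly separated arguments:

        import subprocess

        user_input = "notes.txt; rm -rf ~"   # attacker-controlled text

        # Unsafe: a shell re-parses the assembled string, so the "; rm"
        # part would run as a second command. Left commented out.
        # subprocess.run(f"cat {user_input}", shell=True)

        # Safe: list arguments are never re-parsed by a shell; the whole
        # string is treated as a single (nonexistent) filename.
        subprocess.run(["cat", user_input])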

    • #2452679

      I wish I’d met Amazing Grace, AKA Rear Admiral Hopper.

      I did meet her. And she gave me a gift: a one-foot-long piece of wire, the type used to wire-wrap components onto a circuit board. She carried a basket full of them and handed them out to everyone, saying, “Here’s a nanosecond.”

    • #2452729

      I’d love to have gotten one from her, but I’m thinking I would have been silly enough to ask her for a spare computer bug!
