
Tales from the S-100 dark side.

slow-poke

Ultra Member
C# is alive and well on almost any platform. There was a time when myth had it that assembler coding was needed for interrupt handlers. Even in 1986 that wasn't true anymore on 68k systems, and given the intervening decades it's likely not true for any other architecture either, barring PIC and similar. Maybe not even there.

I started with assembler on Z80s and Univac 1100s, moved to PL/I, Fortran and COBOL, and eventually C. Hated C++, love C#. I tolerate Python but I'm not a fan of its magic indent rules. Don't get me started on the mess created by the Arduino style of C/C++.
Completely agree. To really appreciate what is going on with the uC, writing some actual machine code at some point is really worthwhile; I programmed my third robot entirely in assembly.

I have a decent amount of experience with C, and I'm not a fan of C++ either. I've never used C#; what do you like about it?
 

jcdammeyer

John
Premium Member
I have long thought that "the kids these days" who get their start learning with a high-level language miss out on the understanding of CPU architecture that you can't help but absorb, at least in part, by coding in assembler. I progressed from assembler to C, to Delphi. Then I did contract work for a government agency for Y2K and had to fix up a dog's breakfast of applications using everything from COBOL (which was new to me) to DB III and Visual Basic, to name a few. At my last job before I left this all behind, I finally got to C#, with the company's main product on Windows, then with Mono developing a new application for the BeagleBoard, and finally porting the main Windows program to a Linux version on a regular PC. I found C# to be incredibly productive, but I have not kept up with what's happened with it since MS pretty much clobbered Mono by opening up at least parts of it to open-ish source.

I really liked the BeagleBoard - if I ever find time to get back into tinkering with programming and electronic gadgetry I'll likely get one.
Where are you located?
 

jcdammeyer

John
Premium Member
C# is alive and well on almost any platform. There was a time when myth had it that assembler coding was needed for interrupt handlers. Even in 1986 that wasn't true anymore on 68k systems, and given the intervening decades it's likely not true for any other architecture either, barring PIC and similar. Maybe not even there.

I started with assembler on Z80s and Univac 1100s, moved to PL/I, Fortran and COBOL, and eventually C. Hated C++, love C#. I tolerate Python but I'm not a fan of its magic indent rules. Don't get me started on the mess created by the Arduino style of C/C++.
If I were to put it in a nutshell: IMHO the problem with C++, C#, Python and Arduino C is that they all use a heap and, in some cases, garbage collection, so that, ideally, memory doesn't get fragmented.

However, it's quite possible to write software that loses a byte or two on a periodic basis over time. So 14 hours into the run, for some odd reason, the program resets or locks up. There's no easy way to find out what's going on, because depending on what the program does it may take 12 hours or 18 hours, but the memory leak or fragmentation is a real problem.

We're stuck with the Heisenberg Uncertainty Principle: add some code to debug it and the problem changes or vanishes.

Staying with software that doesn't do garbage collection for embedded systems means C++, C#, Object Pascal or any other kind of object-oriented programming is out of the question.

Again. IMHO
 

Dabbler

ersatz engineer
I spent the first 15 years of my career writing embedded systems. I agree - modern languages are far better than their versions in the 70s and 80s, but aren't perfect.

The trouble is that ASM is going out of fashion. Forth is largely defunct. That leaves C - if you stay away from string functions.
 

trlvn

Ultra Member
If I were to put it in a nutshell: IMHO the problem with C++, C#, Python and Arduino C is that they all use a heap and, in some cases, garbage collection, so that, ideally, memory doesn't get fragmented.

However, it's quite possible to write software that loses a byte or two on a periodic basis over time. So 14 hours into the run, for some odd reason, the program resets or locks up. There's no easy way to find out what's going on, because depending on what the program does it may take 12 hours or 18 hours, but the memory leak or fragmentation is a real problem.

We're stuck with the Heisenberg Uncertainty Principle: add some code to debug it and the problem changes or vanishes.

Staying with software that doesn't do garbage collection for embedded systems means C++, C#, Object Pascal or any other kind of object-oriented programming is out of the question.

Again. IMHO
This is a bit beyond my pay grade, but AIUI automatic reference counting in Objective-C, and more recently in Swift, is all about avoiding the problems related to garbage collection. Both of those languages are, of course, most strongly associated with Apple.

Craig
 

mickeyf

Well-Known Member
@jcdammeyer
Where are you located?
After about 25 years mostly in Victoria, now outside of Duncan for the past 1-1/2 years post retirement.

Daughter shared this link with our family this morning. I pointed out that she likely didn't know that Ethernet cable comes in "Cat5", "Cat6", etc., or what those designations mean.


Another quote from someone: "That bloated monstrosity that is C++" Oops - "bordering on Religion". Feel free to flag or delete....

@slow-poke
what do you like about C#?

I don't recall ever running into issues with garbage collection, but perhaps I was either more careful or more lucky (or just never let the program run long enough, though you'd think a few weeks of uptime would expose that sort of thing). The fact that I didn't need to think about allocating and de-allocating memory let me focus on the actual problems I was trying to solve. I loved C, and it still has a fond place in my programmer's heart. In fact, in my last job I wrote some C drivers to interface with the C# main programs.

Being able to define a... I have to call it an "object"... that listened for inputs, handled any number of conditions, performed any desired operations on them, and removed itself as needed, and then being able to generate any number of those by name as needed, was incredibly useful in the context of what I was working on. Maybe not the best tool for everything.
 

mickeyf

Well-Known Member
This thread reinforces to me that if you're into technology of whatever sort, you're likely into technology of Whatever sort.
 

jcdammeyer

John
Premium Member
After some support from the Lazarus Free Pascal Group I learned what is needed to make an application show the menu bar that is automatically displayed in Windows and Linux.

Now when the MakeBore_IJ program is active the menu line at the top of the screen changes to the menu bar that is normally in the user window.
 

Attachments

  • MakeBore_MAC_OS.jpg (212.8 KB)

jcdammeyer

John
Premium Member
Gotta be a cool story behind that there facility of yours......
I thought I had a photo of all the puppies but at the moment can't find it. Here's just one of the capes hiding in the dog house.
1715033619499.png

I have a stack of BeagleBone books over a foot high.

Only place I've actually used one permanently is with this cape. Front side view.

1715033751800.png
Back side view which shows the BeagleBone Black.
1715033801268.png
And here it is installed in the M68K OS-9 system emulating an ST-506 Hard Drive.
1715034066666.png
 

jcdammeyer

John
Premium Member
Why have I never heard of such a thing! It's like discovering that there are girls in the world.....
What's really cool about it is taking a partly functioning ST-506 hard drive and using the BBB as the disk controller, recovering data from the drive and creating an image. Not a big deal with a 32GB SD card on the Beagle connected to a hard drive that held 15MB; room for multiple tries at an image.

Then the cables are moved around, a jumper is set, and now the cables are connected to the computer and the Beagle is told to run the emulation software with that image. Those blue capacitors on the bottom of the board are supercaps in the Farad range and allow the BBB to run for almost a minute after the original 12V computer power goes away. It has time to properly close files and shut itself off, so no data is lost or corrupted.

Here's some of the output I had logged from (I can't believe) two years ago.
Last login: Sun May 1 18:10:27 2022 from corsair
root@beaglebone:~# cd mfm
root@beaglebone:~/mfm# ./setup_mfm_read
Rev C Board
root@beaglebone:~/mfm# ./mfm_read --format Intel_iSBC_214_256B --sectors 32,0 --heads 8 --cylinders 320 --header_crc 0xffff,0x1021,16,0 --data_crc 0xffffffff,0x140a0445,32,6 --sector_length 256 --retries 50,4 --drive 1 --extracted_data_file rodime_ro204a --emulation_file ../emufile_a
Board revision C detected
Returning to track 0
Retries failed cyl 0 head 0
Bad sectors on cylinder 0 head 0: 18H
ECC Corrections on cylinder 0 head 0: 7(1)
Ran out of data on sector index 0. Track short -2050 bits from expected length. Either deltas lost or index pulse early
Ran out of data on sector index 0.
...
All sectors recovered after 5 retries cyl 39 head 1
ECC Corrections on cylinder 39 head 1: 10(1) 26(1)
All sectors recovered after 1 retries cyl 41 head 0
All sectors recovered after 4 retries cyl 41 head 1
All sectors recovered after 7 retries cyl 43 head 0
All sectors recovered from multiple reads after 14 retries cyl 43 head 1
 