If anybody has any questions relating to coding or computer problems, feel free to message me!

mechanicX said:
Good to know.
If you have any questions, let me know. :)
 
SwedishAMAZING said:
If you have any questions, let me know. :)
I have many, probably too many. I made a thread in another section of the forum.
 
SwedishAMAZING said:
You came to the right place, my friend. I have quite a bit of experience explaining things to the "technologically impaired" :)


Here, I've included pictures so you can see what I'm doing.


First, scroll up (that is, move up, go up) to the very top of the page you're on right now. It should look something like this:
View attachment 78588


Now, click (select) the little image right where I've circled for you in red. When you do, it will look something like this:



View attachment 78589

Next, all you need to do is click (select) the big image I've circled in red. For you, it will be a great big "S". Once you click (select) the big "S", it will look something like this:




View attachment 78590

Once you see this little menu, all you need to do is click (select) the button I've drawn a red arrow to. When you do this, it might look a little different for you than what I've pictured next:
View attachment 78591

Now, what you will do is pick a picture that you have on your phone or computer (whichever you are using at the moment). For me, on my phone, this is what I did: View attachment 78592
Once you pick a picture you like (make it a fun one!), select it and that should be it. I chose the little guy I circled in red above. I clicked him and then selected him, and when that was all done, this is what I saw:
View attachment 78593
Now just click (select) the "okay" button I've pointed to with a red arrow. After that's done, admire your brand new avatar (your profile picture: a picture of you, or a pet, or of a diaper, anything that puts kind of a face to what you're saying online; it represents you, makes you memorable).


That should be all :)

And by the way, TikTok is just short, mostly funny videos, and is easier to use. Instagram is mostly more serious videos, I believe. Twitter is simply posting short messages to a publicly visible place, similar to this website. And I don't like PowerPoint, never will. ;) Avatar was a long film, wasn't it?

Thank you for the question, let me know if I didn't explain something simply enough, or if you have any more questions. Good luck!
I DID IT! How can I possibly Thank You enough? I can think of so many things that I had REALLY WANTED for an avatar, but this one just "felt right." You really helped me out! AND I read some of your other posts as well. So, I have some things to say to you. Believe me, all of them are heartfelt and display my appreciation for how much help you've been!

So, TikTok is funny, but Instagram is serious? So it's like, "Hey guys! Let's watch 'Anchorman!' Then afterwards we can watch 'Lawrence of Arabia'?" This makes sense because...?

PowerPoint: Had to. Decided to go back to college after over 25 years. They don't even USE books anymore. And PowerPoint is EXPECTED for pretty much ANY presentation. God, some of these professors want, like, a PowerPoint that looks like a Rubik's Cube.

I must disagree with you. "Avatar" was not a long film. "Avatar" was a horrid torture experiment that had me screaming for them to make it stop. Hey wait, that sort of reminds me of the actual avatar that I chose just now. That's freaky. Didn't even notice that.

I saw you talking about old video games:

Everything that was Wrong with Intellivision: Stupid disc that didn't actually work as a controller. Let's move on.

How can these kids use these new Playstation controllers? It took me two years just to learn where all the buttons on "Defender" were.

Oh, what the heck anyway? I am, and always will be, A Pinball Wizard.

Thank You SO Much!
 
I have a long-winded question. I shall try to explain it the best way it flows from my brain into words...
When it comes to coding or programming, I am having monumental trouble understanding how to code using constants and variables. There are many, many books I have which claim to be "easy," fast-track this, fast-track that. It has taken me a long time to understand what they mean, and it is not clear cut. I get that the program will run in machine language once compiled; I just can't link the wording with the binary equivalent. When I go down that rabbit hole I start seeing bit this, bit that, big endian, little endian, flipped bits, WORD sizes. Then binary strings going into the processor through the ALU, then going out to specific pins, and then I go down another rabbit hole about how that tallies up with a program on a processor or microcontroller pinout. Stuck is not the word. Frustrated - need to keep my mind active.
Any guidance, books to read (good ones) 👍
 
mechanicX said:
I have a long-winded question. I shall try to explain it the best way it flows from my brain into words...
When it comes to coding or programming, I am having monumental trouble understanding how to code using constants and variables. There are many, many books I have which claim to be "easy," fast-track this, fast-track that. It has taken me a long time to understand what they mean, and it is not clear cut. I get that the program will run in machine language once compiled; I just can't link the wording with the binary equivalent. When I go down that rabbit hole I start seeing bit this, bit that, big endian, little endian, flipped bits, WORD sizes. Then binary strings going into the processor through the ALU, then going out to specific pins, and then I go down another rabbit hole about how that tallies up with a program on a processor or microcontroller pinout. Stuck is not the word. Frustrated - need to keep my mind active.
Any guidance, books to read (good ones) 👍
IMO the best way to deal with all that while coding is to ignore it.
Languages, starting with assembly and moving up through C and even more abstract stuff, all exist so that no one has to deal with the details of machine code and hardware except when necessary.

The process of linking the words you write to the compiled binary is difficult to trace because it's a complex system, built up over decades, so that a compiler can turn any code into a binary that works natively, and in the same way, on any supported CPU as much as possible. The only way to trace it is case by case.

Of course curiosity is cool and fun, so if you're really interested you could do some digital logic design and build an 8-bit computer, or try doing something simple in an assembly language. There are emulators for both of those, so you can do everything in software if you want.

Also, try playing with a decompiler sometime - that's something that takes a binary and generates source code based on it, so the reverse of a compiler. It won't be the same as the original source code of course, but looking at what those things come up with is fun.
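
If you want to see that link between your words and the binary for yourself, the compiler will happily show you its output directly. Here's a minimal sketch - the file and function names are just made up for the example, and the exact assembly you get depends entirely on your compiler, target, and optimization flags:

/* add.c - a tiny file to peek at what the compiler actually emits.
 *
 * See the assembly:          gcc -S -O1 add.c      (writes add.s)
 * See the machine code too:  gcc -c -O1 add.c && objdump -d add.o
 */
#include <stdio.h>

#define SCALE 3                     /* a "constant": it becomes a literal baked into an instruction */

int scale_and_add(int a, int b)     /* "variables" end up as registers or stack slots */
{
    int result = a * SCALE + b;
    return result;
}

int main(void)
{
    printf("%d\n", scale_and_add(2, 5));   /* prints 11 */
    return 0;
}

Reading add.c next to add.s is the quickest way I know to convince yourself that constants are just numbers baked into instructions and variables are just named locations.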
 
Also a programmer here! Always love seeing other programmers "in the wild". I also do freelance work alongside my day job. Programming, DevOps, and SysAdmin here.
 
mechanicX said:
I have a long-winded question. I shall try to explain it the best way it flows from my brain into words...
When it comes to coding or programming, I am having monumental trouble understanding how to code using constants and variables. There are many, many books I have which claim to be "easy," fast-track this, fast-track that. It has taken me a long time to understand what they mean, and it is not clear cut. I get that the program will run in machine language once compiled; I just can't link the wording with the binary equivalent. When I go down that rabbit hole I start seeing bit this, bit that, big endian, little endian, flipped bits, WORD sizes. Then binary strings going into the processor through the ALU, then going out to specific pins, and then I go down another rabbit hole about how that tallies up with a program on a processor or microcontroller pinout. Stuck is not the word. Frustrated - need to keep my mind active.
Any guidance, books to read (good ones) 👍
I really apologize for leaving this thread abandoned! Well, let me just say: this is VERY common for new programmers, even seasoned ones (I speak from experience). It will make you want to give up. It will make you want to quit, it really will! But I promise, if you simply STAY FOCUSED, you will be fine. I recommend the YouTuber Mosh Hamedani. He is very good; I always recommend him to people who ask me for help.

Another piece of advice: pick a language, and stick with it! Pick ONE book, and stick with it! Pick one YouTube video, finish it, and then do it again! There are so many rabbit holes you can easily go down, and most of the information in them is so specific to a certain situation that you will never, EVER, need it.

A story: I was 18, had just moved out of my parents' house and into an apartment in the south of Gothenburg, and I was staying up late every night, learning. Programming, studying. I had already been programming for 5 years at that point, but I was still addicted. I found myself getting lost. Lost in the information. The vast, sheer amount of information is wild. One particular night I caught myself reading an article, a thread, I don't know what it was, about how to program a calculator in COBOL. COBOL! I'm not going to get into a fight over the relevance of that shitshow that was COBOL; I, in my career, will never use it! Ever. I went to sleep early that night.

Stay on task! You got this, brother.
 
Oh... my thing is C, assembly, and microprocessor architecture and some VHDL. I prefer being close to the metal.

It might help to view some microprocessor and electronics material. There is a guy who did a very good series on the Motorola 68000, including how to single-step it using DTACK, since it doesn't have the typical READY or WAIT line. And these old CPUs are not fully static, meaning you have to keep the clock going at some minimum speed or the register and state contents fade and you lock up/crash/reset/?????. He then proceeds to set up a reset and clock circuit and starts the 68000 up like a brain in a jar, with the address and data lines hooked up to mechanical switches, and shows the bus cycle one clock at a time, from the instruction and operand fetches through to output and write-back. E.g. a move-long of a 4-byte immediate to a register takes 3x16-bit bus cycles, or 12 clock cycles, just to fetch the instruction and its parameter from memory.

He even goes over the concept of switch bounce, and just how fast a CPU is: one switch flip can look like hundreds, even thousands, of on/off toggles in an instant.
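
If you ever want to poke at bounce in code rather than with a scope, the usual software fix is just "don't believe a new reading until it has stayed put for a while." Here's a rough sketch in C; read_switch() below only simulates a bouncing contact in software (it's entirely made up), where on real hardware it would be a read of your MCU's input port register:

/* Debounce sketch: only accept a switch state once it has been stable
 * for STABLE_COUNT consecutive samples. */
#include <stdio.h>

#define STABLE_COUNT 8                /* consecutive identical samples required */

static int sample_no = 0;

static int read_switch(void)          /* stand-in for reading a real input pin */
{
    sample_no++;
    if (sample_no < 12)
        return sample_no & 1;         /* crude simulated bounce: 1,0,1,0,... */
    return 1;                         /* ...then the contact settles at 1 (pressed) */
}

int main(void)
{
    int last = read_switch();
    int stable = 0;

    while (stable < STABLE_COUNT) {
        int now = read_switch();
        if (now == last) {
            stable++;                 /* same reading again: count it as stable */
        } else {
            stable = 0;               /* it moved: start counting over */
            last = now;
        }
    }
    printf("debounced state: %d after %d samples\n", last, sample_no);
    return 0;
}

On a real part you'd also space the samples out with a timer, because otherwise the CPU can take every one of those samples inside a single bounce.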

If you alternate between C and the debugger to view the assembly enough, eventually C just looks like a more readable assembler. Machine code opcodes are all that matter; everything from assemblers on up is purely for our human convenience.
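
To make that concrete, here's a tiny function with the rough shape of the assembly a compiler might emit for it written alongside as comments. I want to stress this is just the flavor of the mapping - the real output depends on the compiler, target, and optimization level, and the labels are invented:

/* sum_to.c - C on the left, the idea of the generated x86-64 in comments. */
int sum_to(int n)                    /* n arrives in edi (SysV calling convention) */
{
    int total = 0;                   /*        xor  eax, eax      ; total = 0      */
    for (int i = 1; i <= n; i++)     /*        mov  ecx, 1        ; i = 1
                                        loop:  cmp  ecx, edi      ; i <= n ?
                                                jg   done                           */
        total += i;                  /*        add  eax, ecx      ; total += i
                                                inc  ecx           ; i++
                                                jmp  loop                           */
    return total;                    /* done:  ret                ; result in eax  */
}

Stare at enough pairs like that in a debugger and the translation stops feeling mysterious.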

If you are trying to understand this stuff with loosey-goosey emulated languages and scripting languages like Java or Python, forget it; those are way too high-level and abstract and don't map 1:1 to the computer architecture. You need something strongly typed and architecture-matched like C or Ada.

Then you can start understanding how input latches, output latches, multiplexers, etc., and how combinational logic (ALU, control logic) vs. sequential logic (the fetch-decode-execute timing sequencer and control logic/decode ROM/microcode buffers) work together to make all this stuff tick.

Most of the stuff you're getting overwhelmed with is purely a matter of convention, i.e. engineering, i.e. human design choice. Whether something is stack-relative, absolute, indirect, whatever, it's all the same: read or write an address by driving A0-Axx along with R/W and the other bus synchronization signals, and either driving the D0-Dxx pins with an output driver or copying them into an input buffer. Even the concept of a stack is just a convention that is useful enough that it gets dedicated hardware (registers, instructions, and addressing modes).

A really super useful and simple academic CPU is the 6502. It's a very simple in-order, non-pipelined design of about 3500 transistors, the majority of which are the decode ROM and the carry-lookahead ALU. You can stare at the block diagram long enough to understand what is going on. For me, when I started, the last piece that tied it all together and popped the lightbulb in my head was understanding T states and the ring counter that is the literal heartbeat and central pump of all the CPU's paths and circuits, dictating everything from power-up until the end of time.

Read and write signals inside the processor connect various input and output flip-flops to various buses, enable or disable various bus transceivers to deliver data to and from the ALU, and select which operation to perform, one circuit step at a time (the operations are all being computed in parallel; you are just selecting which circuit's output you care about to store).

It all starts at T0 with fetching the opcode and plugging it into the address input of a decode ROM, which spits out multiple 130+ bit control words containing the state of every controllable part of the CPU at each T state needed to achieve an opcode's stated purpose, in lockstep. The steps are all run in sequence by a ring/Johnson counter: select the right inputs and outputs, select the operation of the ALU, wait out the combinational-logic delay for the output to be valid and save it back, bump the program counter and update the flags, etc., then reset the Johnson counter to go back to T0 for the start of the next fetch.
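
If it helps to see that same fetch/decode/control-word dance as code, here's a deliberately toy model in C. It is nothing like the real 6502 internals - two made-up opcodes, a pretend decode ROM and a loop standing in for the ring counter - but it has the same shape: the opcode indexes a table of control words and a T-state counter walks through them:

/* Toy "control word" CPU: opcode -> sequence of control words -> circuits. */
#include <stdint.h>
#include <stdio.h>

enum {                                   /* each bit enables one pretend circuit */
    CTRL_LOAD_A  = 1 << 0,               /* latch the operand into A             */
    CTRL_ALU_ADD = 1 << 1,               /* ALU computes A + operand             */
    CTRL_WRITE_A = 1 << 2,               /* write the ALU result back into A     */
    CTRL_PC_INC  = 1 << 3,               /* bump the program counter             */
    CTRL_END     = 1 << 4                /* reset the T-state counter to T0      */
};

/* The pretend decode ROM: opcode selects a row, T state selects the column. */
static const uint8_t decode_rom[2][3] = {
    /* opcode 0: "LDA #imm" */ { CTRL_LOAD_A | CTRL_PC_INC,  CTRL_END,                0 },
    /* opcode 1: "ADD #imm" */ { CTRL_ALU_ADD | CTRL_PC_INC, CTRL_WRITE_A | CTRL_END, 0 },
};

int main(void)
{
    const uint8_t program[] = { 0, 7, 1, 5 };   /* LDA #7 ; ADD #5 */
    uint8_t a = 0, pc = 0;

    while (pc < sizeof program) {
        uint8_t opcode  = program[pc++];        /* T0: fetch the opcode          */
        uint8_t operand = program[pc];          /* the operand byte follows it   */
        uint8_t alu_out = 0;

        for (int t = 0; ; t++) {                /* the "ring counter" walking T1, T2, ... */
            uint8_t ctrl = decode_rom[opcode][t];
            if (ctrl & CTRL_LOAD_A)  a = operand;
            if (ctrl & CTRL_ALU_ADD) alu_out = (uint8_t)(a + operand);
            if (ctrl & CTRL_WRITE_A) a = alu_out;
            if (ctrl & CTRL_PC_INC)  pc++;
            if (ctrl & CTRL_END)     break;     /* back to T0 for the next fetch */
        }
    }
    printf("A = %u\n", (unsigned)a);            /* prints A = 12 */
    return 0;
}

The real thing is all gates and latches instead of ifs, and the control words are 130+ bits wide instead of 5, but the lockstep structure is the same.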

Really fascinating stuff. When this diagram starts to make sense, you'll start to become language-independent and feel like you now know every language, even ones you haven't seen before.

(6502 block diagram: BkZ9o.png)


Modern CPUs at their heart are no different from this in concept, just scaled up massively and broken up into sections with memory buffers in between everything, making it all queue-driven so the stages are decoupled from each other and from any given instruction. The 6502's hardwired, microcoded approach means the entire CPU is locked serving a single T state of a single instruction at any point in time, but that makes it very easy to understand.
 
BunnyFofo said:
IMO the best way to deal with all that while coding is to ignore it.
Languages, starting with assembly and moving up through C and even more abstract stuff, all exist so that no one has to deal with the details of machine code and hardware except when necessary.

The process of linking the words you write to the compiled binary is difficult to trace because it's a complex system, built up over decades, so that a compiler can turn any code into a binary that works natively, and in the same way, on any supported CPU as much as possible. The only way to trace it is case by case.

Of course curiosity is cool and fun, so if you're really interested you could do some digital logic design and build an 8-bit computer, or try doing something simple in an assembly language. There are emulators for both of those, so you can do everything in software if you want.

Also, try playing with a decompiler sometime - that's something that takes a binary and generates source code based on it, so the reverse of a compiler. It won't be the same as the original source code of course, but looking at what those things come up with is fun.
Unless it's something like the 65816, where register sizes and operand sizes can be toggled at will by the user with instructions. Ugh. Lol. 🤣 x86 is a lot easier, with its prefix and override bytes letting you know this info linearly, in order, instruction by instruction, while the global state is buried in descriptors set up at the BIOS or OS level and never changed.

For the 65816 you'd need a disassembler capable of pseudo-executing every code path with 100% coverage to make sure you're disassembling everything with correct alignment. Otherwise, a branch you didn't analyze could change the accumulator from 16 bits to 8 bits and return/JMP back, and all instruction decoding assumed to be 16 bits from that point on is misaligned garbage.
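
Here's a tiny taste of that pain in C. This is a deliberately minimal straight-line sketch that only understands REP, SEP and LDA #imm and ignores branches entirely (which is exactly why real tools need full code-path coverage), but even this much has to drag the M flag along just to know how wide an immediate is:

/* Minimal 65816 straight-line disassembly sketch: track the M flag so we
 * know whether an immediate load carries 1 or 2 operand bytes. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    const uint8_t code[] = {
        0xC2, 0x20,         /* REP #$20 -> clears M, 16-bit accumulator */
        0xA9, 0x34, 0x12,   /* LDA #$1234                               */
        0xE2, 0x20,         /* SEP #$20 -> sets M, 8-bit accumulator    */
        0xA9, 0x56          /* LDA #$56                                 */
    };
    int m_flag = 1;         /* 8-bit accumulator out of reset/emulation mode */
    size_t pc = 0;

    while (pc < sizeof code) {
        uint8_t op = code[pc];
        switch (op) {
        case 0xC2:          /* REP #imm: clear the status bits named in the operand */
            if (code[pc + 1] & 0x20) m_flag = 0;
            printf("%04zX  REP #$%02X\n", pc, code[pc + 1]);
            pc += 2;
            break;
        case 0xE2:          /* SEP #imm: set the status bits named in the operand */
            if (code[pc + 1] & 0x20) m_flag = 1;
            printf("%04zX  SEP #$%02X\n", pc, code[pc + 1]);
            pc += 2;
            break;
        case 0xA9:          /* LDA #imm: operand width depends on the M flag */
            if (m_flag) {
                printf("%04zX  LDA #$%02X\n", pc, code[pc + 1]);
                pc += 2;
            } else {
                printf("%04zX  LDA #$%02X%02X\n", pc, code[pc + 2], code[pc + 1]);
                pc += 3;
            }
            break;
        default:
            printf("%04zX  .byte $%02X\n", pc, op);
            pc += 1;
            break;
        }
    }
    return 0;
}

Guess the M state wrong anywhere, and every byte decoded after that point is exactly the misaligned garbage you're describing.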

x86 prefixes are fun to play with: accessing 32-bit operands and instructions from 16-bit assemblers by declaring data bytes between the instructions. It would be hilarious to use REX prefixes and movdq for 64 bits from 16-bit DOS DEBUG, but sadly I believe long mode did away with the virtual 8086 mode that ran DOS. I wonder if you can still access them from V86 in 32-bit mode, or just straight-up real mode fresh out of reset, without an invalid-opcode fault. 🤔🤫🤣
 
LittleAndAlone said:
x86 prefixes are fun to play with: accessing 32-bit operands and instructions from 16-bit assemblers by declaring data bytes between the instructions. It would be hilarious to use REX prefixes and movdq for 64 bits from 16-bit DOS DEBUG, but sadly I believe long mode did away with the virtual 8086 mode that ran DOS. I wonder if you can still access them from V86 in 32-bit mode, or just straight-up real mode fresh out of reset, without an invalid-opcode fault. 🤔🤫🤣
I love you
 