Where did Bytes Come From? - Computerphile
Why do we have 8 bits in a byte? Professor Brailsford on the origins of the humble byte.
Why Use Binary?: • Why Use Binary? - Comp...
Babbage's Puzzle: • Babbage's Puzzle - Com...
Initial Orders II: • Bootstrapping EDSAC: I...
The Font Magicians: • The Font Magicians - C...
Colossus and Bletchley Park: • Colossus & Bletchley P...
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: bit.ly/nottscomputer
Computerphile is a sister project to Brady Haran's Numberphile. More at www.bradyharan.com
Comments: 499
"At home I've got ten-digit log tables, a thick book of them" His bedtime reading, what a savage.
@ikrotzky
a year ago
“- Professor Brailsford: I checked this out weeks ago for a bit of light reading. - Ron Weasley: This is light?”
@blitzmensch
a year ago
@ikrotzky hahah you got me bro
This man ought to be declared a National Treasure.
@RustyTube
6 years ago
*International
@TheTwick
6 years ago
BertyFromDK He is the embodiment of the history of digital computers. I can listen to him talk for hours.
@BertGrink
6 years ago
TheTwick I completely agree. I wish I had had the opportunity to study under his guidance.
@AllanBrunoPetersen
6 years ago
Get this man on a WFPB diet so we can have him around longer. :)
@thomasp2572
6 years ago
Then declare it
Also as a telecoms engineer, I would like to point out that 8 bits wasn't always a byte, so when bandwidth matters we call them "octets" to be sure that we really do mean 8 bits. The term is still in use today.
@AlexSmith-fs6ro
4 years ago
Yes indeed. Some computer hardware used 9 bits for a byte. Telecoms engineers began using the word octet to mean exactly 8 bits.
@brendonholder2522
4 years ago
Hello, I am currently studying for my bachelor's in electrical engineering and I'd like to chat with you about your career, if that's okay?
@IronicHavoc
3 years ago
I mean the way he mentioned it in the video I wouldn't have assumed 8 bits was always called a byte.
@FaeTheo
3 years ago
He literally had a section where he talked about how 8 bits weren't always a byte. 8:22
@twogitsinacar4811
2 years ago
True !
I wrote my first programs on an ICL1903a back in 1971 at the Lanchester Polytechnic where I read Biogeography. It had an Algol 60 compiler, a 24-bit word length and six-bit bytes. The ICL1900 series was very widely used in U.K. academia and government.
I do so love Professor Brailsford's videos. I understand about 1% of what he says usually (although this video was more approachable) but I'm happy to listen to whatever he has to say. His enthusiasm and depth of knowledge is extraordinary and he has such a lovely way about him. :)
@ZeedijkMike
6 years ago
I can only agree. It's always a pleasure to watch him talk - even if some of it flies right over one's head (-;
@LindaLooUK
6 years ago
What an odd comment. I wasn't aware that I was pretending otherwise.
@woodywoodlstein9519
5 years ago
LindaLooUK same.
@woodywoodlstein9519
5 years ago
LindaLooUK his passion yes. Lol.
@Adrian_Finn
4 years ago
It is a special skill he has when even those who understand little of what he says will nonetheless pluck up and listen, I wish all educators were the same.
I liked that quite a bit
@LMABeste
6 years ago
*quyte
@NeilRoy
6 years ago
I see what you did there. ;)
@Roxor128
6 years ago
Alexander Robohm - If a byte is 8 binary digits, then would a quyte be 8 quaternary digits?
@baganatube
6 years ago
A qubit?
@EchoHeo
5 years ago
Roxor128 A quit?
One is a bit, four bits is a nibble, two nibbles is a byte so therefore two bytes is... a word? Calling two bytes a "word" just breaks the whole eating theme! I propose we rename two bytes as a "snack" and maybe four bytes a "meal"! ;) Great video anyhow, as always. Love Mr. Brailsford. And I definitely want to hear more from him about 8bit computing and IBM, though I think I know the answer to that one, having grown up with them.
@eideticex
6 years ago
So by your logic, 8 bytes (64 bits) would be a "feast"? Extending further, would 16 bytes be a buffet? How many bits is a "take out"?
@PvblivsAelivs
6 years ago
For computer speak, it's spelled "nybble."
@ThePharphis
6 years ago
2 bytes is a brownie
@stevefrandsen
6 years ago
And if you lose a byte it could be a burp, losing 2 could be a hiccup, 4 a belch and 8 a puke, 16 a stomach pump and 32 DOA. Just continuing the food theme on a negative scale.
@ffggddss
6 years ago
Cut a byte in half and it's a nybble; so . . . Cut a word in half and call those 2 bytes, a syllable. Fred
In my Computer Systems class, my professor worked for IBM back in the day and told stories from when he worked there. One of those was the big debate over whether the byte should be 8 or 6 bits.
@gwenynorisu6883
5 years ago
Surely that would have been a bysex?
Honestly, like anyone is going to refuse an invitation to learn more about that beautiful historical background! I am but a lowly Electronics Engineering student, but knowing how the pioneers of the field set about innovating and planning early platforms is really interesting, because a lot of this deals with hardware solutions devised to solve practical problems. It reminds me of the (possibly) small group of amazing souls who perform the arduous task of optimising our compilers and working with lower-level languages like Fortran, C, assembly, etc. To imagine that people were just ploughing through these problems is somewhat humbling, and also a little scary. It's like the wild wild west, where the concrete boundaries between electronics and computer science and computer engineering were super blurry, before embedded systems came along and everything became compact. It's like staring into the heart of my subject, and that's what makes me appreciate these historical lessons. Please, keep 'em coming, Professor!
@gwenynorisu6883
5 years ago
Well, it depends if we just get the well-worn story of the PC itself (which is somewhat fascinating, given how much of a departure it was from IBM's usual operating procedure, but has been done to death elsewhere), with which IBM were relatively late to market and only really managed to steal a leading share on the strength of their name (and, I guess, how easy it was to shove a relatively large amount of memory into it, as well as the option of a very sharp terminal-like display). Or we might get the backstory of how they spent most of the 70s, basically from the arrival of the highly promising Intel 8008 / 8080 onwards, trying and failing to produce a "minicomputer in a shoebox" type system that would capture the small-and-home-office imagination in the same way that all their rivals (and, up to 1981, much more successful dominators) seemed to pull off without even thinking about it. They had a lot of interesting and diverse stabs at it with gradually more PC-like offerings, but for some reason it's only the PC itself that finally saw meaningful sales. If it had also flopped, it's likely IBM would have abandoned the small-system arena altogether and remained concentrated on massive corporate mainframes and the like instead. Which, though I'm not really a fan of the PC and XT themselves, would still have been a shame, because they built some fine machines in the late 80s and early 90s, especially when branching out into laptops, and after an early falter with the CGA and (in most applications) EGA, produced some excellent and cost-effective display standards, rather earlier than you might have expected, in the VGA and XGA.
Anyone else want Dr. Brailsfrod to narrate a complete 'history of computing' online course? He's got the perfect mix of domain specific knowledge and grandfatherly tone to make the topic interesting.
When I entered college in the mid-60's, the byte had yet to be standardized. We had a Bendix machine (two conjoined twin G-20's, called by the faculty, a "G-21"), which was on its way out the door, and which had four 7-bit bytes per word. It was replaced with a Univac 1108, which had six 6-bit bytes per word. And then along came an IBM 360/95, with four 8-bit bytes per word. After I graduated, some time in the 70's I believe, the Intel 8080 took off, putting the microprocessor on the map, big-time. It was really the trilobite of the computer era - they were everywhere, in every conceivable device. And by then, the 8-bit byte was firmly established. (There was sometimes also a 4-bit nibble [nybble?].) Fred
@profdaveb6384
6 years ago
Interesting story. But, for the Bendix machine and the Univac 1108, were those bytes actually directly addressable? Or was it just that 7 bits and 6 bits were the character widths?
@ffggddss
6 years ago
They were the character widths, at least (the G-21 had a 127 or 128 character-set); as for addressability, I don't know. All we students were doing with either machine was programming it in ALGOL; we never dealt in assembly or any machine-code level addressing. It's certainly an interesting question. I kinda wish I knew how to find out. (BTW, this was at CMU in Pittsburgh. And I recall a researcher from the UK, Alan Bond, being there for a time, it would have been ca 1967-8, to develop a CRT-screen interface for the G-21. Pretty sure it was a vector-based graphics system. Any chance you might have known him? I don't know what university he was from in UK.) Fred
@gwenynorisu6883
5 years ago
Wow, there was actually a computer with a 7-bit (well, 28-bit) architecture? How utterly bizarre. Were the G20s individually 14-bit, or did the conjoining work in a different way?
Great video! I fondly remember when working on a Univac in the early 80s where we had 36-bit words that could hold *either* 4 9-bit (sort of) ASCII characters *or* 6 6-bit FIELDATA characters that were the standard character set used by the operating system :)
I would watch the video he talks about making at the end
@TheR971
6 years ago
I would watch him talking to garlic bread flying to space for 2.5 hours.
@9999rav
6 years ago
I see what you did there...
@Ice_Karma
6 years ago
Yes! This! So much this!
Brailsford's knowledge of the history/evolution of computing technology is always fascinating.
Professor Brailsford has become quite the storyteller / showman. That end especially :D
He was there when the first computer was made. My man literally interned for Alan Turing.
@CTimmerman
6 years ago
Turing proved that a simple universal machine could compute anything, but Babbage designed a universal calculator in the 19th century.
@gwenynorisu6883
5 years ago
Which is why I'm edgy about seeking clarification over what I heard elsewhere that contradicts this account. The written record is one thing, but right from the mouth of someone who was actually _there_ somewhat trumps it. It'd just be nice to make sure which one is indeed right.
@HashimAziz1
4 years ago
@@gwenynorisu6883 which part of it is contradicted?
This man oozes intelligence every time he opens his mouth.
@twothreebravo
6 years ago
And "Cool"
@RCassinello
6 years ago
He has a way of making everything sound exciting and interesting, too. I could watch him lecture about paint drying.
@woowooNeedsFaith
6 years ago
+Ronstar308 There would probably be some quite interesting physics involved if we got to the details. But that's not computer science...
@NVE
5 years ago
A lot of experience.
@CandidDate
5 years ago
You all want to know the secret of representing space with a byte? Well, take a 2x2x2 cube and each of the eight 1x1x1 cubes within has a distinct value, 1 or 0. The only problem is when you look at this space-byte from the corner, there is one hidden bit. Now connect this way of representing space to a self driving car, and I claim you can represent the entire Universe with stacked space-bytes. Do any of you relate?
Really interesting story, I want to hear more about the 8-bit micro please! :D
One of my favorite channels when it comes to computing. Such a wealth of information.
Best computer science professor I ever had. And I didn't study at the University of Nottingham.
Yes, please do part 2. Cheers, Russ
As a software engineer, I could listen to Prof Brailsford all day long. If he ever decided to write a book, I would buy one for every coder I know.
It's always a pleasure to see a new video from Computerphile, especially with Professor Brailsford
I want this video! It's so incredibly interesting watching and learning about the history behind the individual bytes we often carelessly toss around with reckless abandon. Please make it!
I could listen to this man all day. I wish that I knew all that he knows. Thank you!
You should make a video on little/big endian
@philrod1
6 years ago
Kyle Horne - on YouTube?! Are you mad? The comments will explode!
@mipmipmipmipmip
6 years ago
I once had the same software compiled on both a big and a little endian system. It used a binary format for configuration and output files. What a horrible time that was.
@MrTridac
6 years ago
Oh, yeah. That would be awesome. I bet the Professor has some snarky comments on that mess.
@longlostwraith5106
6 years ago
mipmipmipmipmip A file system I'm working on currently has both big *and* little endian files inside it...
@Roxor128
6 years ago
Get Tom Scott in to rant about the headaches that go with it. It'll be up there with his ones about time zones and internationalisation.
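The big/little-endian mix-up described in this thread can be sketched in a few lines of Python (an illustrative aside, not from the video; the value 0x0A0B0C0D is an arbitrary example):

```python
import struct

# The same 32-bit value serialises to a different byte order
# depending on the endianness convention.
value = 0x0A0B0C0D

big    = struct.pack(">I", value)   # big-endian: most significant byte first
little = struct.pack("<I", value)   # little-endian: least significant first

print(big.hex())     # '0a0b0c0d'
print(little.hex())  # '0d0c0b0a'

# Reading bytes back with the wrong convention silently scrambles the value,
# which is exactly the binary-config-file headache described above.
print(hex(struct.unpack("<I", big)[0]))  # '0xd0c0b0a'
```

This is why binary file formats and network protocols have to pin down a byte order explicitly rather than leaving it to whichever machine wrote the data.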
0:38 "word-based" [dramatic snap zoom]
@HashimAziz1
4 years ago
Hahahaha
I could listen to such great storytelling for days and days...
Prof' Brailsford brought up something interesting in that early computers were "word based." If there's any way to do a video on this, or anything, please do. Unfortunately it wasn't touched on in the rest of the video.
@w0ttheh3ll
4 years ago
Actually, it's the key point of the rest of the video. What you're probably missing is that "word" refers to a specific amount of memory. For example, if a computer architecture has 32-bit registers, 32 bits is considered the word size of that computer. "Word based" means that a word is the smallest amount of main memory that can be read from or written to. Modern computers can usually address single bytes, allowing for finer-grained memory access.
@RobBCactive
4 years ago
The word size is the natural integer and address size used by the CPU. Word-based, as opposed to byte-addressable, relates to memory organisation. Both RISC and CISC computers prefer memory access aligned on natural sizes: every 4 bytes for int32, or 8 for int64, for example. As register sizes and calculations can work on even larger units (e.g. AVX 256 or 512), and memory data transfers occur in units of cache lines, the original concept of words became less useful. For example, AMD64 cannot penalise use of 32-bit integers over 64-bit, as compatibility with Win32 binaries was a key requirement.
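The "word based" idea in this thread can be sketched in Python: on a machine whose memory hands out only whole words, reading a single byte means fetching the enclosing word and shifting/masking. This is a generic illustration, not a model of any specific machine, and the `big_endian` flag is only there to echo the byte-numbering question raised elsewhere in the comments:

```python
# Sketch: reading single bytes from a word-addressed memory, where the
# hardware can only fetch whole 32-bit words (here, a list of Python ints).
WORD_BYTES = 4

def read_byte(memory, byte_addr, big_endian=True):
    """Fetch the enclosing 32-bit word, then shift/mask out one byte."""
    word = memory[byte_addr // WORD_BYTES]     # only whole words are addressable
    offset = byte_addr % WORD_BYTES            # position within the word
    if big_endian:
        shift = 8 * (WORD_BYTES - 1 - offset)  # byte 0 = most significant
    else:
        shift = 8 * offset                     # byte 0 = least significant
    return (word >> shift) & 0xFF

memory = [0x41424344]  # one 32-bit word holding the ASCII bytes 'A','B','C','D'
print(hex(read_byte(memory, 0)))                    # '0x41' (big-endian layout)
print(hex(read_byte(memory, 3, big_endian=False)))  # '0x41' again: same byte, other end
```

A byte-addressable machine does the shift/mask in hardware (or avoids it entirely), which is exactly the convenience the video attributes to the byte-based designs.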
Some word-oriented computers did have instructions for manipulating smaller units of data. For example, the 24-bit word ICL 1900 series used 6-bit characters packed 4 to a word. It had several instructions that used a special type of pointer to directly address characters within a word. Similarly, the 36-bit word DEC PDP-10 (DECsystem10 and DECSYSTEM20) had some processors with the capability to handle arbitrary-length "bytes" packed within 36-bit words, but most commonly used 6, 7, 8 and 9 bit "bytes".
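The ICL 1900 style of packing described above (four 6-bit characters to a 24-bit word) can be sketched like this; the function names are my own, purely for illustration:

```python
# Sketch of four 6-bit characters packed into one 24-bit word,
# character 0 being the most significant (ICL 1900 style).
def pack_chars(chars):
    """Pack four 6-bit values (0..63) into a 24-bit word."""
    assert len(chars) == 4 and all(0 <= c < 64 for c in chars)
    word = 0
    for c in chars:
        word = (word << 6) | c
    return word

def unpack_char(word, index):
    """Pull character `index` (0..3) back out of a 24-bit word."""
    shift = 6 * (3 - index)
    return (word >> shift) & 0o77   # 6-bit mask, written in octal

w = pack_chars([1, 2, 3, 4])
print([unpack_char(w, i) for i in range(4)])  # [1, 2, 3, 4]
```

The special character pointers the comment mentions essentially did this shift-and-mask in hardware.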
The first big kit I used was a new Honeywell computer in 1980, which had nine-bit bytes, 36-bit words. And ran Multics, so we had a decent Algol 68 compiler. Nice :)
@captainz9
2 years ago
DecSystem20 (2060, actually a PDP-10) used 6-bit encoding for text; 36-bit machine, 18-bit addressing. I learned a lot on that system, my first "big" machine after starting on the Apple-II/TRS-80.
We demand! Another great video. Thank you for all the effort and research that goes into making this great content
I understood about 2/3 of Prof Brailsford's topic here, but I just have to watch these vids, because his knowledge, delivery and passion pull you in and carry you along
Thank you sir! Such knowledge, along with such a perfect manner of narration, is what I always wanted for my education. And having been in IT for more than 25 years, I still learn from your videos! I deeply appreciate that. Please continue to educate us and even the younger generations who may be interested in the basics. I bow my knee and head! Cheers from Russia!
this guy's voice is so relaxing. I could listen to him talk about bytes and computing history all day!
Hahaha. Wow. I wish Brailsford could have lectured my professors a while ago on how this actually hangs together, so they wouldn't have docked me a full grade for not just parroting the story in the textbook.
Clear, byte-sized description of their source. Thanks.
Your stories are great, thank you very much for taking the time to share with us.
The cliffhangers are real. Awesome presentation.
This man sounds really pleasant. Half of the time I have no idea what he is talking about, but his voice and manner of speech is really soothing lol
I just read a thing in which "pieces of eight" are called bits. (That's an old Spanish coin that was often divided into smaller bits.)
@allanrichardson1468
4 years ago
That’s where expressions like “two bits” for 25 US cents came from. Of course, the professor is British, so he wouldn’t mention that.
Yes, do a video on how IBM saw personal computing but then had to give up the game. BTW, brilliant videos all! I came into computing in the 60s and 70s via numerical analysis and Lie group, rotation group and JTFA mathematics, as well as a good level of nonlinear, dynamical systems analysis, and hearing you talk about all of this - and explain it very clearly - is a real joy. You remind me of one of my early mathematics teachers from high school (yes, I grew up in the US), so thank you!
Should make the sequel.... Waiting desperately for it... 👍
Whoever did the captions is a world class perfectionist
Thank you for offering your content in 4K resolution!
One way you can see the shift to byte-based architectures is the supplanting of octal (base 8) by hexadecimal (base 16) as the prevalent compact way to write binary numbers. An octal digit is three bits, which isn't that convenient if your memory is organized into bytes -- a long octal number will have digits split across byte boundaries. But a hexadecimal digit is exactly four bits, so two of them fit into a byte exactly. I remember my dad in the 70s bringing home core dumps out of the trash bin for me to scribble on the blank backs of, and these dumps were always pages and pages of octal numbers, just digits from 0 through 7. The machines he was working on used a 36-bit word, a multiple of 3, so it split nicely into octal digits. But in the microcomputer era everyone at that end was using "hex" instead: 0-9 mixed with the letters A-F to represent the digits 0 through 15. Much more convenient if you dealt in bytes. DEC's machines are interesting. The PDP-1 and several others in the PDP line used an 18-bit word, the PDP-8 was 12-bit (both multiples of 3, and divisors of that nice 36-bit length he mentioned in the video). Naturally you'd use octal to write constants at this width. The PDP-11 was actually a 16-bit machine, but DEC people still used octal for everything at that point, which must have been a little awkward. I guess it was tradition. You'll notice that the C language (and its many descendants) seems to slightly privilege octal over hexadecimal -- you just need to preface an octal constant with "0" whereas hexadecimal constants take "0x".
@thomasstambaugh5181
2 years ago
I was a hardware engineer at Digital from 1974 through 1982, and I worked in the PDP-11 systems engineering group. At CMU, where I got my degree, a PDP-8 was in the main EE lab on campus. So far as I know, it was the only computer on campus that undergraduates could actually touch -- as in start, stop, and toggle-in code (including the boot sequence). When I first got to Digital, there were four PDP-11 flavors -- each designed and built by a different designer. Digital created the VAX-11 as a 32-bit counterpart to the 16-bit PDP-11s. While I enjoyed this video, I have a different perspective on this history -- colored in no small part by my own history. It isn't that anything here is incorrect; it is instead that a lot was going on in the 1970s that isn't mentioned here. There is no mention of Digital, Prime, Data General, and so on. The first "personal computer" was the PDP-8 -- intentionally designed and built to be used by a single person. Digital arguably created the entire "low end" market by intentionally selling products where the price of each component was under $50K (some said $30K). This was not an arbitrary choice -- it was the threshold above which prospective customers had to capitalize the buy. The Digital business model was to bypass purchasing departments -- and therefore sidestep the dominance of IBM that this video talks about -- and instead sell directly to lab managers. I'm ignoring the "OEM play" that was the other big invention of Digital. In the early years of Digital, the PDP-8 had four 3-bit bytes per word while the PDP-11 had four 4-bit bytes per word. The PDP-11 was always byte-addressable. Finally, this video fails to mention another design choice of enormous import -- how are bytes indexed within a word? Is byte 0 the most significant or least significant byte? Anybody who's coded in C -- especially the C of the late 1970s -- is accustomed to worrying about "Endianness".
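The octal-vs-hex point in this thread is easy to demonstrate: each octal digit covers exactly 3 bits and each hex digit exactly 4, so a 36-bit word splits evenly into octal digits while an 8-bit byte splits evenly into hex digits. A small illustration (not from the video):

```python
# Why octal suits word sizes that are multiples of 3, and hex suits bytes.
value36 = (1 << 36) - 1          # a 36-bit word, all ones

print(len(f"{value36:o}"))       # 12 octal digits: 36 / 3 splits evenly
print(f"{0xFF:o}")               # one byte in octal: '377' -- the top digit
                                 # only covers 2 of the byte's 8 bits
print(f"{0xFF:x}")               # 'ff': two hex digits cover the byte exactly

# The C-style literal prefixes mentioned above (Python spells octal 0o):
print(0o377 == 0xFF == 255)      # True: same byte, three notations
```

So once bytes became the addressable unit, hex dumps naturally displaced the pages of octal the comment describes.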
I'm currently watching this video to better understand bytes so that I can properly translate it into my Native American language (Navaho). It will be crucial for my efforts in making a Navaho word for qubit. This is a side project on quantum hardware engineering in Navaho. Many thanks for your videos.
Great video and awesome explanation.
That enthusiasm and passion about computers should be emulated! Great discussion. What I learned? Bit is like the atom of computers.
5:31 Slight irony in IBM’s dominance, though: the most popular programming language for commercial applications was COBOL, which IBM wasn’t really keen on. It kept trying to push its customers to use its own “Commercial Translator”, but they kept pushing back and insisting on COBOL. So finally it had to give way. COBOL was also one of the first “portable” (or at least, largely portable) languages. So a nice side effect of its dominance was that it weakened vendors’ lock-in of their customers.
You remind me of Sir David Attenborough with your narration. Every time I hear you, I am LOST...! Thank you so much
I've been reading about computer history lately, so I will join in requesting the follow up on IBM's first attempt at the personal computer
Professor Brailsford is like the nerd grandad I never had.
I had to write a subprogram in PL/1 to multiply a ratio with 6 decimal places (that stayed the same) by a value with 4 decimal places that went from 0.0001 to 9,999,999,999,999.0000 (this was a currency value). Took me months to get it working - and the solution was so easy in the end......
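The kind of fixed-point multiply described above can be sketched with Python's decimal module (a Python sketch, not PL/1, and `apply_ratio` is my own illustrative name): a 6-decimal-place ratio times a 4-decimal-place currency amount, with the result rounded back to 4 places.

```python
from decimal import Decimal, ROUND_HALF_UP

def apply_ratio(ratio, amount):
    """ratio: 6dp Decimal, amount: 4dp Decimal -> amount scaled, to 4dp."""
    product = ratio * amount   # exact decimal arithmetic, no binary rounding
    return product.quantize(Decimal("0.0001"), rounding=ROUND_HALF_UP)

# The extreme end of the stated range: a 13-digit currency value.
print(apply_ratio(Decimal("0.123456"), Decimal("9999999999999.0000")))
# 1234559999999.8765
```

Doing this with binary floating point instead would lose pennies at the top of that range, which is presumably why the original had to be done so carefully in PL/1's fixed decimal types.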
After I left college, probably mid-1990s, I was looking at the idea of using three values for a digit instead of two. Binary uses 1/0, on/off, +3v/0v. So I worked some math and saw that +/0/- had some very nice elegance: you can pack about three bits of information into every two digits (2^3 ~ 3^2). Later I did some yahooing and saw that someone else had done circuitry, but he left his notation as 0/1/2.
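The density observation in the comment above (2^3 ~ 3^2, so two ternary digits carry slightly more information than three bits) checks out numerically; a quick illustration:

```python
import math

# 3**2 = 9 is just above 2**3 = 8, so two trits beat three bits.
print(2**3, 3**2)                     # 8 9

# Information per digit: a trit carries log2(3) bits.
print(round(math.log2(3), 3))         # 1.585

# Digits needed to cover one million distinct values, in each base:
n = 1_000_000
print(math.ceil(math.log(n, 2)))      # 20 bits
print(math.ceil(math.log(n, 3)))      # 13 trits
```

This is the same reasoning behind the historical interest in balanced ternary machines such as the Soviet Setun.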
Another great video from the Professor
In a follow-up (or several) you could also talk about EBCDIC and how it came about, i.e. its evolution from numeric punched cards (9+9+8, avoiding adjacent holes), as I have only recently learnt.
Having watched a lot of these videos in the past weeks, "hardcore" seems to be one of Brailsford's favorite words.
There is demand! We demand an explanation!
Kudos to Computerphile for preserving the knowledge.
I have read this in articles. Took so much more time to learn it and even though I knew most of the stuff, I am pleased to find some more interesting info. Very well explained! Tnx :)
Incredibly informative. Thank you!
Love your histories Prof Brailsford
A remarkable person. I like his humorous way of telling things, adding a whole bunch of experience.
Excellent Presentation!!
My word, YouTube knows what I like. This was so damn interesting. Definitely subscribing and looking forward to watching the other videos.
Yes, please continue the story about IBM 8 bit machines...fascinating.
I'm just beginning to learn about computers... I aspire to one day understand every word this man says.
The hardware side is important too. For addressing memory, each power of 2 gets a dedicated wire which turns on and off - that can be processed effectively with simple gates.
I see a new video with Professor Brailsford, and I *immediately* have to click on it. I so look forward to his videos!
Prof Brailsford is one of the few folks on YouTube who speaks fast enough to stop me from putting it at 1.5x
As always a very good video! I love to hear about the historical facts. Could you possibly make a video about Konrad Zuse and his Z series of (to be) computers?
I've always wondered why the Kilobyte (1000/1024 bytes) was decided upon as the next amount of bytes worth naming? ~1000 bytes seems an arbitrary amount of bytes to be concerned about in a binary world. Is it just the spectre of base10 hanging over the binary world? As usual, Professor Brailsford is an absolute pleasure to watch.
@MRCAGR1
6 years ago
Adam Feather IIRC it is the nearest power of 2 to 1000. 1000 in SI nomenclature is kilo, hence Kilo is used in computing parlance, with an upper case k to distinguish it from 1000. Likewise mega is used as the prefix to represent 10^6, the binary equivalent is 1024^2 again with an uppercase m. Coincidentally 1024 is 2^10.
@Adam-ce5mq
6 years ago
Thanks for taking the time to reply. It feels like you're reiterating my intuition on why ~1000 bytes was where the stick was put in the ground to signify the first quantity of bytes we consider of import: we're from a base10 world and already think in terms of thousands. 1000 just feels like an arbitrary number of bytes to put the next label on from a purely base2 perspective.
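The point in this thread (1024 = 2^10 being the power of two nearest 1000) is easy to make concrete; a small illustration of the decimal-vs-binary prefix gap:

```python
# Decimal (SI) prefixes vs the power-of-two sizes memory actually comes in.
KB  = 10**3   # SI kilobyte: 1000 bytes
KiB = 2**10   # binary "kibibyte": 1024 bytes, nearest power of 2 to 1000

print(KiB)                       # 1024
print(2**20, 10**6)              # mebibyte vs megabyte: 1048576 vs 1000000

# The ~2.4% gap compounds with each step up the prefix ladder:
for power, name in [(10, "K"), (20, "M"), (30, "G")]:
    binary, decimal = 2**power, 10**(3 * power // 10)
    print(name, round(binary / decimal - 1, 3))   # relative oversize
```

That compounding gap is why a "1 TB" disk (decimal, as marketed) shows up as roughly 931 "GB" (binary, as many operating systems count).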
Wow! What a narrator! What a talent!
I wish he was my grandpa; I would literally sit there and listen to all these stories. ❤
Keep in mind that IBM's 2nd generation systems - like the 1401 - used a 6-bit character via the Binary Coded Decimal (BCD) method. It was not until the 3rd generation 360 systems that the 8-bit byte became standard and characters were encoded using EBCDIC - Extended Binary Coded Decimal Interchange Code. Sorry, but the 650 business systems predate even me! Starting with the 360 series, IBM offered the 'Decimal Arithmetic' feature that packed two digits as nibbles into one byte. The value 12345+ would be stored internally as HEX 'F1F2F3F4F5' in character mode (think COBOL 'PICTURE 99999'), but as packed decimal it would be stored as HEX '12345F'.
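The two encodings described above can be sketched in Python (a simplified illustration of the idea, not IBM's actual microcode; the function names are mine): zoned decimal stores one digit per byte with a 0xF high nibble, while packed decimal stores two digits per byte with a trailing sign nibble.

```python
def zoned(digits):
    """Zoned decimal: each digit stored as one byte, 0xF in the high nibble."""
    return bytes(0xF0 | int(d) for d in digits)

def packed(digits, positive=True):
    """Packed decimal: two digits per byte, sign nibble (F/D) last."""
    nibbles = [int(d) for d in digits] + [0xF if positive else 0xD]
    if len(nibbles) % 2:
        nibbles.insert(0, 0)                  # pad to whole bytes
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

print(zoned("12345").hex().upper())   # F1F2F3F4F5
print(packed("12345").hex().upper())  # 12345F
```

The outputs reproduce the two hex strings quoted in the comment, and show why packed decimal roughly halves the storage for numeric fields.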
I would happily pursue a career in computer engineering if every computer engineer spoke with such eloquence.
11:10 there's a demand, please go on :)
I’d give Professor Brailsford multiple bottles of fine red wine and enjoy letting him talk all day 😀
Such a brilliant presentation... I need the second video, where is it ??
0:56 "Waffly bits of text" this is too funny.
@gordonrichardson2972
6 years ago
Funny, but true! Early computers were very expensive. You don't spend $10,000+ to do word processing...
9:12 DEC were able to do the same sort of thing, a little bit later, at a fraction of the cost: the PDP-11 was their first byte-addressable machine.
Subtitles imperfect (some words missing, spelled incorrectly) but otherwise very detailed. True CC (Closed captions) experience, thank you.
Great story. Waiting for the end
Prof. Brailsford is the Prof. Poliakoff of computing.
Thank you very much for this video! It is very interesting and educational :)
The David Attenborough of Computer Science
@BrianJ001
5 years ago
prof knows what he is talking about - David Attenborough is a champagne Socialist with a Climate Catastrophic mantra. Wrong as wrong can be!
The way I know it, a byte is eight bits. And everything in my world is based on that - KB, MB, GB, PB. And yeah, BCD is a biggie too - how my TRS-80 with an 8-bit Z-80 could represent numbers up to 65536 with decent precision. Now of course I'm on a 64-bit PC laptop where the max addressable by 64 bits is 18446744073709551616. Big difference there. And micros' character sets have been 5, 6, 7, and 8 bits in the past. No idea what they are now - but if Windows alt sequences are any indication, Unicode uses lots more bits.
What a great story teller!
Wonderful, just wonderful!
"Hardcore numerical operations." ~ Prof. Brailsford 2k18
This man is a walking encyclopedia of computing knowledge. I would love to have a beer or two with him.
Boredom was nibbling me away when I started searching YouTube a bit and discovered this. My very first reaction was to click for a byte of this information.
When I started programming a computer, it was an IBM discrete transistor, 1401, in which addressing was decimal. There were no fixed words and instructions were delimited by word marks. The 8 bit byte consisted of a 6 bit BCD character, word mark bit and parity bit. When I started programming 360 series computers, the idea of EBCDIC characters and words of 2 bytes seemed a bit alien.
Awesome video!
9:00 “Sheer speed” of IBM’s hardware!? Somewhere, Seymour Cray is going “HAHAHAHAHAHAHA!!!”
Tell us more about the quantum chemistry work! That sounds awesome.
Dr. Brailsford is awesome. +++
I love these stories.
This man makes me so confused at first watch, and mind blown at second watch.