>int getchar( void );
>int
>getCHAR
Fucking toy language.
>>100187100
redditfrog please, how is getchar supposed to return a char when it also needs to signal EOF? Enlighten us.
>>100187118
Should've made EOF a char??
>>100187139
which char value is supposed to be the invalid one when every value of char is a valid character?
>>100187139
This zoomer frog with the puffy hair is so funny :^)
>>100187150
Why the fuck should I know? Ask the boomers who came up with this retarded design. They could've used 0 for EOF and 1 for the null terminator (which is a retarded design by itself).
>>100187187
>0 for EOF
0 is a valid character and therefore cannot be used to signal end of file.
>>100187100
What's the problem?
>>100187204
It's only valid because retarded boomers said so.
There are 256 possible values that can be stored in a char. getchar needs to be able to return 257 possible values -- all values that fit in a char, plus EOF.
>>100187260
>BEL (0x07 = \a), which causes a terminal to beep and/or flash.
Wonderful use of valuable space. I'm sure there are plenty of useless boomer idiotic characters that could've been used as EOF.
>>100187311
Suppose I am writing a parser for an image format. I have piped the image into stdin for my parser to read. The image can contain values in the range 0-255 for each color channel of each pixel. Here, a char is not representing text but binary data. How are we to represent that there are no more pixels left to read?
>>100187118
Wouldn't you just return EOT, which is 0x04?
>>100187311
>Here, a char is not representing text but binary data.
???????? CHAR should represent a CHARacter????? Why the fuck are cniles so retarded?
>>100187238
it is always valid
>>100187118
Why the fuck are you autists signaling in-band? Why is a "getchar" even concerned with an EOF?
>>100187330
I do not know nor endorse the use of getchar; it's your fault for wanting to use it.
>>100187317
And chars are represented by numbers.
>>100187343
>chars are represented by numbers.
Cniles read this shit and see nothing wrong. Why would you have a special 'CHAR' type then???
>another dunning kruger frogposter thread
you're just stupid. go play videogames and don't make more threads
>>100187367
Because it tells the compiler what to expect in that type so it can optimize the assembly it generates.
>>100187317
The char data type is used for bytes. A character is just one type of byte. But there is no reason why char necessarily represents an ASCII-encoded character, or even any form of text data.
>>100187367
This zoomer does not understand that all data is bytes. He thinks the computer magically writes letters into memory without using numbers.
>While it shows how clever the likes of K&R were, you should probably be looking at something more ... newbie-friendly.
>>100187411
>But there is no reason why char necessarily represents a char
Cniles read this and see nothing wrong. Why the fuck would you have a type for a char then????????
>>100187367
It's not special. It's a base integer data type. There are 5 base integer data types, plus the unsigned versions:
char (minimum 8 bits)
short (minimum 16 bits)
int (minimum 16 bits)
long (minimum 32 bits)
long long (minimum 64 bits)
Each platform is free to define the size of these data types provided that:
sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long)
>>100187427
We have a type for bytes. We call it char, because the most common use for a single byte is to store a character. But it can store any one-byte value.
By the way, the type of a character literal is int, not char.
You don't understand, CHARs are special in memory, they aren't bits but actual text. If you examine the banks under the microscope you can see the individual teeny tiny letters.
For the guy who insists characters are special... I hate to break it to you.
#include <stddef.h>
size_t this_function_returns_four(void)
{
    return sizeof('a');
}
>>100187276
My vote is for form feed or vertical tab. Nobody uses that shit anymore.
>we're using CHARACTER type for representing the bytes!
>why you couldn't name the type 'byte' and use a different type for characters?
>w-w-we just couldn't, okay??? SHUT UP!!
Toy language.
>>100187444
>type of a character literal is int, not char.
Fucking retarded toy language.
>>100187495
Here's something for ya: many people think byte means 8 bits. It doesn't -- that's what an octet is. A byte is the smallest addressable unit of memory on a machine... which is usually 8 bits, but some machines had word sizes of 18 or 36 bits and used 9-bit bytes. A char can be 9 bits on those platforms. Now imagine the confused retards screeching that the byte data type was sometimes 9 bits. Imagine having to explain to someone why we have a BYTE_BITS constant.
>>100187118
Make another function called char eof(FILE*) that checks for EOF. Just call it before calling getchar.
>>100187495
You are learning that conventions exist for our own sake, to abstract away the fact that computers are just binary machines. Now you just have to get over it.
>>100187541
how about just read(2) and write(2) more than one byte at a time? since you get the number of bytes read back into your buffer, you already know you ran out of bytes when read returns 0. fread wraps these syscalls and it just works.
>>100187568
Because that's antisemitism you fucking hamas chud.
>>100187311
You definitely shouldn't use a char for that. Not only is the size of char left to the implementation, it's also not its fucking purpose. You use the int_ types for that; in your case 16 bits is enough.
>>100187444
the type for bytes is uint8_t; char's size is implementation dependent and shouldn't be used to represent exactly a byte of data.
>>100187276
>>100187475
*reserves thousands of codepoints for emoji and skin tone modifiers*
>>100188477
sizeof(char) == 1 always
But yes, for a byte of data you should use unsigned char.
>>100188497
You can make that logical distinction, but uint8_t is always a typedef for unsigned char or char, because char is at least 8 bits and char is the smallest C type.
>>100187444
>By the way, the type of a character literal is int, not char.
Is there a difference between the "type" of the literal and the effect of integer promotion?
>>100188517
those aren't ASCII tho
>>100188523
See >>100187435
That is absolutely not true, C never enforces a specific size for its primitives, hence why the uint_ and int_ types exist. I've already found issues with int not being the same fucking size in two different stdlibs.
>>100188532
>oh no no 1 ASCII codepoint is wasted on vertical tab!!!!111111111
>leaves 128-255 undefined
>>100188538
"sizeof(char) == 1" is always true, moron. Learn some fucking C. sizeof returns the size in units of char, not bytes.
>>100188551
Read the actual ISO definition of C. It is not guaranteed to be a byte or "1" unit; it happens to be the most common size, but it is by no means guaranteed.
>>100188538
>>100188551
Standard literally says so btw.
https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
>>100188575
What about 2-byte-char platforms then? I can't see it guaranteeing to return 1 while also leaving the size of char up to the implementation; something else must be missing.
>>100188624
>2 byte
you mean CHAR_BIT == 16 and sizeof(char) == 1?
Byte is a nonstandard thing; just because your babyduck syndrome makes you think that byte means CHAR_BIT == 8 doesn't mean it's true. There used to be machines with retarded sizes like 13 bits per byte and it worked just fine.
Rare thread where the frogposter is right
>>100188643
Ah I see my mistake. So if you need exactly 8 bits you should use uint8_t, but if you need a "byte" you can use the primitives. Still, for reading a file you should use uint8_t since most parsers assume an 8-bit byte.
>>100188683
on such a platform uint8_t won't even be defined, but normally if you use such a platform you already know this and can optimize for the case that each byte is 16 bits
>>100188697
Anyway, why do we need int getchar again? char already goes from -127 to 127 and the ASCII table only has 128 characters.
>>100188712
I don't know which tard came up with the idea of reading one char at a time; it's never a good solution. A good solution is allocating a big buffer and copying into it, and an even better one is simply memory-mapping the file so it's already transparently in memory. Maybe it's slower at the start, but you can use SIMD on incoming data without worrying about copying shit anywhere, so it's good.
>>100188575
I don't really like how the standard defines things.
Here, finally, it's implied that byte = a CHAR_BIT unit, which must mean that char is a byte, and a byte doesn't have to be 8 bits.
Convoluted crap. And I like C.
>>100188729
>>100188745
>>100188771
>byte is implementation defined collection of bits
how is this convoluted?
The CHAR_BIT minimum required value is 8. It can be higher in theory; in practice only some obscure DSP platforms that don't support byte addressing set it higher. I think POSIX even requires that CHAR_BIT == 8.
>>100188800
Why can't they just say that a char is a byte?
Why can't they say 7-bit ASCII instead of "basic execution character set"?
The C spec is way too abstract and roundabout.
>caring about types in C
even python respects types more than C
>>100188817
And of course, the answer is IBM
https://en.wikipedia.org/wiki/EBCDIC
>>100188817
Well, it's a lawyer-tier language, always has been, and I don't make the rules. Also, everyone knows that ASCII is 7-bit; only a heretic would think otherwise, so no one needs to talk about it.
The most cringe thing about C is that whether char is signed or unsigned is implementation defined, and while that's annoying, it's simply proof that ASCII is 7-bit, because the 8th bit is a sign.
>>100188851
>The most cringe thing about C is that char is actually implementation defined whether it's unsigned or signed
What's even more cringe is that "char", "signed char", and "unsigned char" are all distinct types (even if char has exactly the same range and semantics as one of the latter).
>>100188864
in C there are only those 3; in C++ we now have 4 of them, the 4th being a neutered version of unsigned char called std::byte
Stop writing code that tries to respect CHAR_BIT!=8.
>>100188972
>>100188972
>>100189007
already nobody writes such code; only microcontroller fags might be subject to this and they're on their own, not anyone's problem
>>100189169
https://github.com/search?q=CHAR_BIT&type=code
>>100187100
reminder that C used to be untyped garbage (it's still garbage).
>>100189205
>>100189273
>shithub login-walling basic features now
were they getting DoS'd or is this M$ just trying to drive fake engagement for some weird ass adware bullshit?
>>100187100
std::cin >> std::noskipws;
compile with g++ and ignore all autists screeching because you used C++
>>100189442
I don't want to know.