/g/ - Technology


File: stare5.jpg (34 KB, 512x512)
>int getchar( void );
>int
>getCHAR

Fucking toy language.
>>
>>100187100
redditfrog please, how is getchar supposed to return a value that's not a char when it needs to signal an EOF? Enlighten us.
>>
File: stare6.jpg (29 KB, 512x512)
>>100187118
Should've made EOF a char??
>>
>>100187139
which char is supposed to become the invalid value when all values of char are valid characters?
>>
>>100187139
This zoomer frog with the puffy hair is so funny :^)
>>
File: stare7.jpg (35 KB, 512x512)
>>100187150
How the fuck should I know? Ask the boomers who came up with this retarded design. They could've used 0 for EOF and 1 for the null terminator (which is a retarded design by itself)
>>
>>100187187
>0 for EOF
0 is a valid character and therefore cannot be used to signal end of file
>>
>>100187100
What's the problem?
>>
File: stare9.jpg (35 KB, 512x512)
>>100187204
It's only valid because retarded boomers said so.
>>
There are 256 possible values that can be stored in a char.
getchar needs to be able to return 257 possible values -- all values that fit in a char, plus EOF.
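That's the entire reason the return type is int. The usual idiom looks something like this (a minimal sketch, straight out of any C book):

#include <stdio.h>

int main(void)
{
    int c; /* must be int: has to hold all 256 byte values AND the out-of-band EOF */

    while ((c = getchar()) != EOF)
        putchar(c);

    return 0;
}

If c were a char, either EOF would collide with a real byte value, or (on unsigned-char platforms) the comparison against EOF would never be true and the loop would never end.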
>>
File: stare10.jpg (23 KB, 512x512)
>>100187260
> BEL (0x07 = \a), which causes a terminal to beep and/or flash.

Wonderful use of valuable space. I'm sure there are plenty of useless boomer idiotic characters that could've been used as EOF.
>>
>>100187276
Suppose I am writing a parser for an image format. I have piped the image into stdin for my parser to read. The image can contain values in the range 0-255 for each color channel of each pixel. Here, a char is not representing text but binary data. How are we to represent that there are no more pixels left to read?
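Concretely, a hypothetical channel reader might look like this (function name made up for illustration):

#include <stdio.h>

/* hypothetical: read one 8-bit color channel from stdin */
int read_channel(unsigned char *out)
{
    int c = getchar();       /* int return keeps EOF distinguishable */
    if (c == EOF)
        return 0;            /* stream is done -- no more pixels */
    *out = (unsigned char)c; /* 0x00..0xFF are ALL valid channel values */
    return 1;
}

Any in-band sentinel -- EOT, 0, whatever -- would be indistinguishable from a legitimate pixel value.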
>>
>>100187118
Wouldn't you just return EOT, which is 0x04?
>>
File: stare3.jpg (59 KB, 1024x960)
>>100187311
> Here, a char is not representing text but binary data.

???????? CHAR should represent a CHARacter????? Why the fuck are cniles so retarded?
>>
>>100187238
it is always valid
>>
>>100187118
Why the fuck are you autists signaling in-band? Why is a "getchar" even concerned with an EOF?
>>
>>100187330
I do not know nor endorse use of getchar, it's your fault for wanting to do it.
>>
>>100187317
And chars are represented by numbers.
>>
>>100187343
> chars are represented by numbers.

Cniles read this shit and see nothing wrong.
Why would you have a special 'CHAR' type then???
>>
>another dunning kruger frogposter thread
you're just stupid. go play videogames and don't make more threads
>>
>>100187367
Because it tells the compiler what to expect in that type so it can optimize the assembly it generates.
>>
>>100187317
The char data type is used for bytes. A character is just one type of byte. But there is no reason why char necessarily represents an ASCII-encoded character, or even any form of text data.
>>
>>100187367
This zoomer does not understand that all data is bytes. He thinks the computer magically writes letters into memory without using numbers
>>
File: smug.jpg (6 KB, 220x215)
>While it shows how clever the likes of K&R were, you should probably be looking at something more ... newbie-friendly.
>>
>>100187411
> But there is no reason why char necessarily represents a char

Cniles read this and see nothing wrong.

Why the fuck would you have a type for a char then????????
>>
>>100187367
It's not special. It's a base integer data type. There are 5 base integer data types, plus the unsigned versions:

char (minimum 8 bits)
short (minimum 16 bits)
int (minimum 16 bits)
long (minimum 32 bits)
long long (minimum 64 bits)

Each platform is free to define the size of these data types provided that:
sizeof(char) <= sizeof(short) <= sizeof(int) <= sizeof(long) <= sizeof(long long).
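You can print what your platform actually picked -- a quick sanity check, assuming a hosted implementation:

#include <stdio.h>

int main(void)
{
    /* only the minimums and the ordering are guaranteed by the standard */
    printf("char:      %zu\n", sizeof(char)); /* always 1, by definition */
    printf("short:     %zu\n", sizeof(short));
    printf("int:       %zu\n", sizeof(int));
    printf("long:      %zu\n", sizeof(long));
    printf("long long: %zu\n", sizeof(long long));
    return 0;
}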
>>
>>100187427
We have a type for bytes. We call it char, because the most common use for a single byte is to store a character. But it can store any one byte value.

By the way, the type of a character literal is int, not char.
>>
You don't understand, CHARs are special in memory, they aren't bits but actual text. If you examine the banks under the microscope you can see the individual teeny tiny letters.
>>
For the guy who insists characters are special... I hate to break it to you.
#include <stddef.h>

size_t this_function_returns_four(void)
{
    return sizeof('a'); /* character literals are int, so this is sizeof(int) -- 4 on typical platforms */
}
>>
File: ScreenshotTile.png (176 KB, 1080x2060)
>>100187276
My vote is for form feed or vertical tab. Nobody uses that shit anymore.
>>
>we're using the CHARACTER type for representing bytes!
>why couldn't you name the type 'byte' and use a different type for characters?
>w-w-we just couldn't, okay??? SHUT UP!!
Toy language.


>>100187444
>type of a character literal is int, not char.
Fucking retarded toy language.
>>
>>100187495
Here's something for ya: many people think byte means 8 bits. It doesn't -- that's what an octet is. A byte is the smallest addressable unit of memory on a machine... which is usually 8 bits, but some machines had word sizes of 18 or 36 bits and used 9-bit bytes. A char can be 9 bits on these platforms. Now imagine the confused retards screeching that the byte data type was sometimes 9 bits. Imagine having to explain to someone why we have a BYTE_BITS constant.
>>
>>100187118
Make another function called char eof(FILE*) that checks for EOF. Just call it before calling getchar.
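C actually ships something close to that: feof(3). The catch is that feof only reports EOF after a read has already hit it, so "check before calling getchar" gives you the classic bug (deliberately broken sketch below):

#include <stdio.h>

int main(void)
{
    /* BROKEN: feof() only becomes true AFTER a read fails, so the
       final iteration happily processes the EOF return value */
    while (!feof(stdin)) {
        int c = getchar();
        putchar(c); /* writes a bogus 0xFF byte at end of input */
    }
    return 0;
}

Which is exactly why the out-of-band sentinel in the return value is the cleaner design.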
>>
>>100187495
You are learning that conventions exist for our own sake, to abstract away the fact that computers are just binary machines.
Now you just have to get over it.
>>
>>100187541
how about you just read(2) and write(2) more than one byte at a time? Since you get back the number of bytes read into your buffer, you already know you ran out of bytes when read returns 0.
fread wraps these syscalls and it just works.
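Something like this sketch, assuming POSIX (error handling elided):

#include <unistd.h>

/* fill buf from stdin; read() returning 0 is the EOF signal,
   so no byte value has to be sacrificed as a sentinel */
ssize_t drain_stdin(char *buf, size_t cap)
{
    size_t off = 0;
    ssize_t n;

    while (off < cap && (n = read(0, buf + off, cap - off)) > 0)
        off += (size_t)n;

    return (ssize_t)off;
}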
>>
>>100187568
Because that's antisemitism you fucking hamas chud.
>>
>>100187311
You definitely shouldn't use a char for that, anon: not only is the size of char left to the implementation, it's also not its fucking purpose. You use the fixed-width intN_t types for that; in your case 16 bits is enough.
>>
>>100187444
the type for bytes is uint8_t; char's size is implementation-dependent and it shouldn't be used to represent exactly one byte of data.
>>
>>100187276
>>100187475
*reserves thousands of codepoints for emoji and skin tone modifiers*
>>
>>100188477
sizeof(char) == 1 always
But yes, for byte of data you should use unsigned char.
>>
>>100188497
You can make that logical distinction, but uint8_t is in practice always a typedef to unsigned char, because char is at least 8 bits and char is the smallest C type.
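A compile-time check, if you want to be paranoid about it (C11 and up):

#include <limits.h>
#include <stdint.h>

/* if <stdint.h> defines uint8_t at all, chars must be exactly 8 bits */
_Static_assert(CHAR_BIT == 8, "uint8_t requires 8-bit chars");
_Static_assert(sizeof(uint8_t) == 1, "uint8_t is one char wide");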

>>100187444
>By the way, the type of a character literal is int, not char.
Is there a difference between the "type" of the literal and the effect of integer promotion?
>>
>>100188517
those aren't ASCII tho
>>
>>100188523
See >>100187435
That is absolutely not true, C never enforces a specific size for its primitives, hence why the uintN_t and intN_t types exist. I've already run into issues with int not being the same fucking size in two different stdlibs.
>>
>>100188532
>oh no no 1 ASCII codepoint is wasted on vertical tab!!!!111111111
>leaves 128-255 undefined
>>
File: 1708559593884042.jpg (53 KB, 594x595)
>>100188538
"sizeof(char) == 1" is always true, moron. Learn some fucking C. sizeof returns the size in units of char, not bytes.
>>
>>100188551
Read the actual ISO definition of C, anon: it is not guaranteed to be a byte or "1" unit. That happens to be its most common size, but it is by no means guaranteed.
>>
>>100188538
>>100188551
Standard literally says so btw.

https://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf
>>
>>100188575
What about 2-byte char platforms then? I can't see how it can both guarantee a 1 and leave the size of char up to the implementation; something else must be missing.
>>
>>100188624
>2 byte
you mean CHAR_BIT == 16 and sizeof(char) == 1?
Byte is a nonstandard thing, just because your babyduck syndrome makes you think that byte means CHAR_BIT = 8, doesn't mean it's true, there used to be machines with retarded sizes like 13 bits per byte and it worked just fine.
>>
Rare thread where the frogposter is right
>>
>>100188643
Ah, I see my mistake: if you need exactly 8 bits you should use uint8_t, but if you just need a "byte" you can use the primitives. Still, for reading a file you should use uint8_t, since most formats assume an 8-bit byte.
>>
>>100188683
on such a platform uint8_t won't exist at all, but normally if you're targeting such a platform you already know this and can optimize for the case that each byte is 16 bits
>>
>>100188697
Anyway, why do we need int getchar again? char already goes from -127 to 127 and the ASCII table only has 128 characters.
>>
>>100188712
I don't know which tard came up with the idea of reading one char at a time; it's never a good solution. A good solution is allocating a big buffer and reading into it, and an even better one is simply memory-mapping the file so it's already transparently in memory. Maybe it's slower at the start, but you can run SIMD over the incoming data without worrying about copying anything anywhere, so it's good.
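A rough sketch of the mmap approach, assuming POSIX (error checks elided):

#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

/* map a whole file read-only; the kernel pages it in on demand */
const unsigned char *map_file(const char *path, size_t *len)
{
    struct stat st;
    int fd = open(path, O_RDONLY);

    fstat(fd, &st);
    *len = (size_t)st.st_size;

    void *p = mmap(NULL, *len, PROT_READ, MAP_PRIVATE, fd, 0);
    close(fd); /* the mapping survives the close */
    return p == MAP_FAILED ? NULL : p;
}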
>>
File: byte.png (20 KB, 1273x234)
>>100188575
I don't really like how the standard defines things.
>>
File: char2.png (60 KB, 1267x344)
Here, finally, it's implied that a byte = CHAR_BIT bits, which must mean that a char is a byte, and that a byte doesn't have to be 8 bits.
Convoluted crap. And I like C.
>>
File: sizeof.png (63 KB, 1257x447)
>>
>>100188729
>>100188745
>>100188771
>byte is implementation defined collection of bits
how is this convoluted?
>>
File: char_bit.png (8 KB, 923x106)
The minimum required value of CHAR_BIT. It can be higher in theory; in practice only some obscure DSP platforms that don't support byte addressing set it higher. I think POSIX even requires that CHAR_BIT == 8.
>>
File: char.png (35 KB, 1258x224)
>>100188800
Why can't they just say that a char is a byte?
Why can't they say 7-bit ASCII instead of "basic execution character set"?
The C spec is way too abstract and roundabout.
>>
>caring about types in C
even python respects types more than C
>>
>>100188817
And of course, the answer is IBM
https://en.wikipedia.org/wiki/EBCDIC
>>
>>100188817
Well, it's a lawyer-tier language, always has been, and I don't make the rules.
Also, everyone knows that ASCII is 7-bit; only a heretic would think otherwise, so no one needs to talk about it.
The most cringe thing about C is that it's implementation-defined whether char is signed or unsigned, and while that's annoying, it's also proof that ASCII is 7-bit, because the 8th bit is a sign bit.
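You can probe what your compiler decided with one line from <limits.h>:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_MIN is 0 if plain char is unsigned, negative if signed */
    printf("plain char is %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");
    return 0;
}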
>>
>>100188851
>The most cringe thing about C is that char is actually implementation defined whether it's unsigned or signed,
What's even more cringe is that "char", "signed char", and "unsigned char" are all distinct types (even if char has exactly the same range and semantics as one of the latter).
>>
>>100188864
they're distinct in C too; in C++ we now have 4 of them, the 4th being a neutered version of unsigned char called std::byte
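You can watch the distinctness with C11 _Generic -- this only compiles because all three are different types (minimal sketch):

#include <stdio.h>

#define KIND(x) _Generic((x),           \
    char:          "plain char",        \
    signed char:   "signed char",       \
    unsigned char: "unsigned char")

int main(void)
{
    char c = 0;
    signed char sc = 0;
    unsigned char uc = 0;

    /* three distinct types, even though plain char behaves
       exactly like one of the other two */
    puts(KIND(c));
    puts(KIND(sc));
    puts(KIND(uc));
    return 0;
}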
>>
File: posix.png (12 KB, 518x118)
Stop writing code that tries to respect CHAR_BIT!=8.
>>
File: posix2.png (25 KB, 1924x271)
>>100188972
>>
>>100188972
>>100189007
nobody writes such code anymore; only microcontroller fags might be subject to this, and they're on their own, not anyone's problem
>>
>>100189169
https://github.com/search?q=CHAR_BIT&type=code
>>
>>100187100
reminder that C used to be untyped garbage (it's still garbage).
>>
>>100189205
>>
>>100189273
>shithub login-walling basic features now
were they getting DoS'd or is this M$ just trying to drive fake engagement for some weird ass adware bullshit?
>>
>>100187100
std::cin >> std::noskipws;
compile with gcc and ignore all autists screeching because you used c++
>>
>>100189442
I don't want to know.


