What is meant by "integral"? Is it any type that is stored as a whole number, so no float, double, or long double (the non-integral types)?
The book just used "literal" without fully explaining it, and gave the examples below. BUT I read further and found out that bool and char are considered integral types too? bool, I assume, because it is essentially a small integer under the hood, and char because it is stored as a numeric (e.g. ASCII) value. What about strings? Sure, they are classes, but they are arrays of chars... so they are probably integral too, right?
Incidentally, I read the first chapter and a little of the second chapter of my second book (Beginning C++20 from Novice to Master Overtimer), and it too used "integral" when referring to integers, but did not fully explain it either... maybe down the road. It was clear that it was referring to integers, but I still had to go to an outside source to see what bool and char were really considered.
So, in the end, integral means all of the types except float, double, long double, and any type that can hold a fraction?
unsigned long long int for integral literal
long double for floating point literal
char, wchar_t, char16_t, and char32_t for character literal
const char* for raw string literal
const char* together with size_t for string literal
Why is a switch-case construct limited to integral types like int and char? Why would they implement this limitation? You can compare if (3.14 < 3.1415) just fine, so why not in a switch?
Yes, I saw that both return the same value when first encountering them, and so I just moved on. But my C++ senses are tingling too loudly now.
"Returns the length of the string, in terms of bytes."
Now, I don't know anything about programming with UTF encodings, but I know they exist and that sometimes a character can be represented by more than one byte (8 bits). So please tell me that .length() really just reports the length of the string (without the \0), independent of the encoding, as I thought it should? Why do both size() and length() report the same thing? Is there ever a time when those two values will not be the same?
My first book used size_t and did not explain it, and then I saw it used in a few places online. I found out it is just an unsigned integer, and sources said it is 4 bytes. I verified it was 4 at the time, and then one fine day all of a sudden it was 8. I was too new at the time, but I thought it had something to do with certain compilers'/computers' allocation of memory.
Now I realize that I am on an x64 PC: when my debug build was set to x86 it was 4 bytes, and when it read 8 bytes the build was set to x64.
"size_t is an unsigned integral type" //From link above.
But on x64 that can't mean unsigned int, because sizeof(unsigned int) = 4 bytes there, while size_t = unsigned long long int (8 bytes).
So I guess the doc should be???:
"size_t is an unsigned int" //For x86
"size_t is an unsigned long long int" //For x64