
From that page: "At the moment, all implementations use 32-bit ints, an essentially arbitrary decision. However, we expect that int will be increased to 64 bits on 64-bit architectures in a future release of Go."

So, in other words, it's Amateur Hour. All righty, then.

/backs toward door, reaches for doorknob, still smiling and nodding



Fantastically dumb comment. FWIW, int in C is also 32-bit on all mainstream 64-bit operating systems, and Go has int64.
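For the skeptical, here's a minimal Go sketch to check this on your own machine (the value printed for int depends on your platform and Go release; int64 is 8 bytes everywhere):

  package main

  import (
  	"fmt"
  	"unsafe"
  )

  func main() {
  	// unsafe.Sizeof reports sizes in bytes. int's size is
  	// implementation-specific; int64 is exactly 64 bits everywhere.
  	fmt.Println("sizeof(int)   =", unsafe.Sizeof(int(0)))
  	fmt.Println("sizeof(int64) =", unsafe.Sizeof(int64(0))) // always 8
  }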


Why on Earth would you want 64-bit ints by default, though? I'm not sure why they're thinking about changing it.


ints are used in the definition of some of the core interfaces. That has implications for how many entries a collection (one that implements these interfaces) can contain.

For example, sort.Interface:

  package sort

  // A type, typically a collection, that satisfies sort.Interface can be
  // sorted by the routines in this package.  The methods require that the
  // elements of the collection be enumerated by an integer index.
  type Interface interface {
  	// Len is the number of elements in the collection.
  	Len() int
  	// Less returns whether the element with index i should sort
  	// before the element with index j.
  	Less(i, j int) bool
  	// Swap swaps the elements with indexes i and j.
  	Swap(i, j int)
  }

It would have implications for array indices as well, since those are also of type int.
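To make the limit concrete, here's a minimal sketch of a type satisfying sort.Interface (ByValue is a made-up name). Since Len returns an int, a collection sorted through this interface can't report more elements than int can hold:

  package main

  import (
  	"fmt"
  	"sort"
  )

  // ByValue is a hypothetical type implementing sort.Interface.
  type ByValue []int

  func (s ByValue) Len() int           { return len(s) } // bounded by int's max value
  func (s ByValue) Less(i, j int) bool { return s[i] < s[j] }
  func (s ByValue) Swap(i, j int)      { s[i], s[j] = s[j], s[i] }

  func main() {
  	xs := ByValue{3, 1, 2}
  	sort.Sort(xs)
  	fmt.Println(xs) // [1 2 3]
  }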


Fantastically dumb comment. FWIW, int in C is also 32-bit on all mainstream 64-bit operating systems, and Go has int64.

ROFL. I challenge you to find documentation for the C language or any other language commonly used in production where they speculate that they might wake up one day and change sizeof(int) on an existing platform.

"Fantastically dumb," indeed.


C is exactly such a language, among many others: sizeof(int) is implementation-specific, and it does in fact vary, though (S)ILP64 data models are rare. sizeof(long), on the other hand, varies all the time.

You seem very ignorant; please learn and stop spreading FUD.


You seem very ignorant

Yeah, that must be it.


C doesn't even define them, other than setting a minimum. There have been C implementations where char was 64 bits.

As for documentation for the C language where they "speculate" that the size isn't defined, just given a minimum bound, here you go: "Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown, with the same sign" (C99 spec, section 5.2.4.2.1).

It also allows sign-and-magnitude and ones' complement integer representations in addition to two's complement (C99 spec, section 6.2.6.2).

It also doesn't fix the number of bits in a byte; CHAR_BIT is only required to be at least 8 (C99 spec, sections 3.6 and 5.2.4.2.1).
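For contrast, Go fully specifies its fixed-width types and leaves only int and uint implementation-specific (32 or 64 bits). A quick sketch, assuming a reasonably recent Go toolchain (math/bits.UintSize reports the width of uint, which matches int):

  package main

  import (
  	"fmt"
  	"math/bits"
  	"unsafe"
  )

  func main() {
  	// These sizes are fixed by the Go spec on every platform: 1 2 4 8.
  	fmt.Println(unsafe.Sizeof(int8(0)), unsafe.Sizeof(int16(0)),
  		unsafe.Sizeof(int32(0)), unsafe.Sizeof(int64(0)))
  	// Only int and uint are left implementation-specific.
  	fmt.Println("int is", bits.UintSize, "bits on this platform")
  }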


Holy @#$@, people. Did I wake up in a naïveté vortex?

No, C doesn't define sizeof(int); it's implementation-defined. But you don't change it once you've implemented it in a given development environment... not if you want people to create and maintain production code with your tools. You don't speculate that one day you might want to change sizeof(int). It's just not something you do if you want to be taken seriously.

Is anyone in this thread over the age of 16?


GCC has options that change the size of int without even modifying the compiler. In other words, the sizes of the integer types aren't even fixed within a single implementation.

If you write production C code that depends on anything other than the minimum sizes defined in limits.h, your code is buggy.
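The same discipline carries over to Go: when a particular width matters, name it explicitly instead of assuming anything about int. A small sketch with a hypothetical sum64 helper:

  package main

  import "fmt"

  // sum64 widens explicitly to int64 rather than assuming that
  // int (or any accumulator) happens to be 64 bits wide.
  func sum64(xs []int32) int64 {
  	var total int64
  	for _, x := range xs {
  		total += int64(x)
  	}
  	return total
  }

  func main() {
  	// Three values that individually fit in int32 but whose sum does not.
  	fmt.Println(sum64([]int32{1 << 30, 1 << 30, 1 << 30})) // 3221225472
  }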



