LuaTahi
Rick Murray (539) 13851 posts
I would expect two things. Firstly, for the functions written by each to be defined as to what sort of data types they expect as parameters (I could, for instance, do all of my Wimp window positions using double if I wanted, so long as I cast to the correct integer type before calling the SWI veneers; it would be ridiculous, but it would be possible) and to not have this arbitrarily altered by somebody else (remember the Acorn pink dinosaur?); and secondly for the project manager to intervene and make a decision if these two keep on going at it over which is the most appropriate type.

Ultimately the choice of type depends upon the machine, the project, and the purpose…not ego, belief, or cargo cult.
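To illustrate the (admittedly ridiculous) idea: the veneer name below is made up, standing in for whatever the real SWI call would be, and the positions are kept as doubles internally with a cast at the boundary.

    #include <stdio.h>

    /* Hypothetical veneer, standing in for a real Wimp SWI call; the only
       point is that it expects plain ints, not doubles.                   */
    static void open_window_at(int x, int y)
    {
        printf("window opened at %d,%d\n", x, y);
    }

    int main(void)
    {
        double win_x = 120.0, win_y = 640.0;    /* positions kept as doubles internally   */

        open_window_at((int)win_x, (int)win_y); /* cast to the expected type at the call  */
        return 0;
    }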
Steffen Huber (91) 1953 posts
It is the minimum number of bits, not the exact number of bits. If it were easy, it wouldn’t be C.
And indeed the C spec does not guarantee that a byte consists of eight bits. ISTR DEC being quite fond of 9-bit bytes and 36-bit words. It is complicated. But at least later C standards try to give the software developer a chance of better controlling the environment.
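For instance, C99’s limits.h and stdint.h let you at least ask the question, and pin widths down where you need to (assuming the exact-width types exist on the target):

    #include <limits.h>   /* CHAR_BIT                                   */
    #include <stdint.h>   /* C99 exact- and minimum-width integer types */
    #include <stdio.h>

    int main(void)
    {
        printf("bits in a char: %d\n", CHAR_BIT);                      /* at least 8, not necessarily 8    */
        printf("int_least16_t : %zu bytes\n", sizeof(int_least16_t));  /* at least 16 bits, always present */
        printf("uint32_t      : %zu bytes\n", sizeof(uint32_t));       /* exactly 32 bits, where provided  */
        return 0;
    }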
GavinWraith (26) 1563 posts
should have been SICP. The point is that the project manager does not have to arbitrate between two alternatives. So long as the rest of the program deals with widgets abstractly, there can be interfaces between it and Ander’s contribution and also between it and Bob’s. The manager provides a specification for the widget library, but only at an abstract level. So long as Ander’s and Bob’s contributions satisfy the specification, the implementation details are irrelevant.
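Roughly, in C terms (the names here are invented purely for illustration), the manager’s specification is just a header; Ander and Bob each supply an implementation behind it:

    /* widget.h -- the specification the manager provides, at an abstract level */
    typedef struct widget widget;            /* opaque: callers never see the fields   */

    widget *widget_create(int initial_size); /* Ander's or Bob's code may back these   */
    int     widget_size(const widget *w);
    void    widget_destroy(widget *w);

Ander can hold the size in one integer type and Bob in another; as long as both implementations satisfy widget.h, the rest of the program neither knows nor cares.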
Rick Murray (539) 13851 posts
One of the reasons I don’t do geekery for a living. The specification is sacrosanct, even when it is wrong, nonsense, or the hardware provides an easier, simpler solution.

Example? A 6502-based design for an embedded thingy. This was in the days before PICs and ATmegas. Number A was only ever going to be a power of two. Number B was as well. To determine a result, number A needed to be multiplied or divided by number B. Now the clever ones among you will realise that this can be achieved with a simple shift operation, something provided by every halfwit processor since forever. Yet that infernal specification stated “multiply by” and “divide by”, so we found ourselves obliged to use a lengthy mathematical routine to do what should have taken a couple of instructions and maybe a dozen cycles tops.

The whole endeavour was endless amounts of that, and it became quite clear that the spec was written by somebody who didn’t understand how a processor works, yet was utterly unwilling to have any input from those who did. The 6502 has some nice freaky addressing schemes (not to mention all the fun of page zero based instructions)…that we couldn’t use thanks to a spec with a memory arrangement so bad it would have shamed a toddler.

Indeed, the only input we managed to have was about the placement of the stack. The original spec didn’t mention this at all (I seem to recall it referred to a “cupboard” in which things could be thrown) and the pencil necks freaked out over the fact that this bit of memory couldn’t be used for such and such a purpose. Their reaction? To rewrite huge amounts of the spec, changing things, which meant we had to go through the code to ensure that it correctly followed the new spec.

Shortly after, I quit and went to work for an estate agent making photocopies and designing property sheets using Ovation. Much more fulfilling.
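For anyone who hasn’t met the trick: when the multiplier or divisor is a power of two, the whole thing collapses to a shift. In C rather than 6502 assembler, the idea is:

    #include <stdio.h>

    int main(void)
    {
        unsigned int a = 48;

        /* B = 8 = 2 to the power 3, so multiplying or dividing by B is a shift by 3 */
        unsigned int times_b = a << 3;   /* same as a * 8                     */
        unsigned int over_b  = a >> 3;   /* same as a / 8 for unsigned values */

        printf("%u * 8 = %u, %u / 8 = %u\n", a, times_b, a, over_b);
        return 0;
    }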
If somebody is intent on using one type of variable and somebody else is intent on using another, and the only interfacing between them has a lot of hidden casting between the types, I’d say the implementation details should be extremely relevant should anybody bother to check. It’s things like that (and endless use of RPN in variable names) that make maintaining a project something of a nightmare, especially if you weren’t the one to make that mess in the first place.
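A miniature, entirely contrived example of the sort of silent conversion that hides at such an interface:

    #include <stdio.h>

    /* One side of the interface works in doubles, the other in ints;
       the conversion happens silently at the call boundary.          */
    static int half_of(int n) { return n / 2; }

    int main(void)
    {
        double value = 7.9;
        printf("%d\n", half_of(value));  /* 7.9 is quietly truncated to 7; prints 3 */
        return 0;
    }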
This is probably the insane baggage that is lurking in C’s history. Modern desktop machines use eight-bit bytes, sizeof says a char is a byte in size, and limits.h says a char is at least eight bits. This leaves leeway for crazy old hardware while pretty much saying “a char is an eight-bit byte on any machine you’re going to be able to afford to plonk on your desk”…
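And if code genuinely depends on that, a C11 compiler can be made to say so rather than leaving it to luck:

    #include <assert.h>   /* C11: static_assert */
    #include <limits.h>   /* CHAR_BIT           */

    /* Refuse to compile anywhere a char is not exactly eight bits. */
    static_assert(CHAR_BIT == 8, "this code assumes eight-bit bytes");

    int main(void) { return 0; }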
GavinWraith (26) 1563 posts
I think I should keep my mouth shut, seeing that I have zero experience of business. My impression is that basically the reason for most failures is bullying, desire to control and not listening.
Steve Pampling (1551) 8172 posts
It’s not a case of losing track of numbers; I simply stopped counting the instances of some over-promoted, out-of-depth personage stating how I needed to do something. I did on one occasion explode a little: “If I’m the expert here, as you say, will you shut the f***¹ up and leave me to do my expert job the way I know and you don’t?”

¹ Actually used the full word; he leant back from the “blast” and I winced internally.