Can the following code run?
01
__sbit __at (0x87) LED = 0 ;
__sbit __at (0x95) LS0 = 0 ;
__sbit __at (0x96) LS1 = 0 ;
__sbit __at (0x97) LS2 = 0 ;
void main()
{
}
What is the result when it runs?
Compare it with the following:
02
__sbit __at (0x87) LED = 1 ;
__sbit __at (0x95) LS0 = 0 ;
__sbit __at (0x96) LS1 = 0 ;
__sbit __at (0x97) LS2 = 0 ;
void main()
{
}
The generated binary files (Intel HEX) are as follows:
01
:03000000020006F5
:0B005F00C287C295C296C29702000340
:0300030002006A8E
:01006A002273
:06003500E478FFF6D8FD9F
:200013007900E94400601B7A0090006F780175A000E493F2A308B8000205A0D9F4DAF27524
:02003300A0FF2C
:20003B007800E84400600A790175A000E4F309D8FC7800E84400600C7900900001E4F0A3C3
:04005B00D8FCD9FAFA
:0D00060075810712006BE5826003020003A4
:04006B007582002278
:00000001FF
02
:03000000020006F5
:0B005F00D287C295C296C29702000330
:0300030002006A8E
:01006A002273
:06003500E478FFF6D8FD9F
:200013007900E94400601B7A0090006F780175A000E493F2A308B8000205A0D9F4DAF27524
:02003300A0FF2C
:20003B007800E84400600A790175A000E4F309D8FC7800E84400600C7900900001E4F0A3C3
:04005B00D8FCD9FAFA
:0D00060075810712006BE5826003020003A4
:04006B007582002278
:00000001FF
Comparing the two HEX files, the only difference is in the record at address 0x005F: its first data byte is C2 (the 8051 CLR bit opcode) when LED is initialized to 0, and D2 (SETB bit) when it is initialized to 1; apart from that record's checksum, the rest of the output is identical.
Here is a more typical program:
#include <8052.h>
#define LSA P1_5
#define LSB P1_6
#define LSC P1_7
void EXint_Init(void);
void Delayms(unsigned int);
void Down2Up(int);
void Up2Down(int);
void EXINT0() __interrupt 0;
void EXINT1() __interrupt 2;
void main()
{
    EXint_Init();
    LSA = 0;
    LSB = 0;
    LSC = 0;
    while (1);    /* idle loop: all the work happens in the interrupt handlers */
}
void EXint_Init()
{
    IT0 = 1;     /* INT0: falling-edge triggered */
    IT1 = 0;     /* INT1: low-level triggered */
    // IPH = 0x40;
    PX1 = 1;     /* raise the priority of INT1 */
    EA  = 1;     /* global interrupt enable */
    EX0 = 1;     /* enable external interrupt 0 */
    EX1 = 1;     /* enable external interrupt 1 */
}
void Delayms(unsigned int xms)
{
    unsigned int i, j;
    for (i = xms; i > 0; i--)
    {
        for (j = 110; j > 0; j--);
    }
}
void EXINT0() __interrupt 0
{
    Down2Up(3);
}
void EXINT1() __interrupt 2
{
    Up2Down(3);
}
void Down2Up(int x)
{
    int i, j;
    unsigned char sel = 0xfe;    /* 1111 1110 */
    for (i = 0; i < x; i++)
    {
        for (j = 0; j < 8; j++)
        {
            P0 = sel;
            Delayms(250);
            sel = sel << 1;      /* shift the pattern one bit toward the MSB */
        }
        sel = 0xfe;
    }
}
void Up2Down(int x)
{
    int i, j;
    unsigned char sel = 0x7f;    /* 0111 1111 */
    for (i = 0; i < x; i++)
    {
        for (j = 0; j < 8; j++)
        {
            P0 = sel;
            Delayms(250);
            sel = sel >> 1;      /* shift the pattern one bit toward the LSB */
        }
        sel = 0x7f;
    }
}
www.radford.edu/ibarland/Manifestoes/whyC++isBad.shtml
Why C and C++ are Awful Programming Languages
Imagine you are a construction worker, and your boss tells you to connect the gas pipe in the basement to the street's gas main. You go downstairs, and find that there's a glitch; this house doesn't have a basement. Perhaps you decide to do nothing, or perhaps you decide to whimsically interpret your instruction by attaching the gas main to some other nearby fixture, perhaps the neighbor's air intake. Either way, suppose you report back to your boss that you're done.
KWABOOM! When the dust settles from the explosion, you'd be guilty of criminal negligence.
Yet this is exactly what happens in many computer languages. In C/C++, the programmer (boss) can write "house"[-1] * 37. It's not clear what was intended, but clearly some mistake has been made. It would certainly be possible for the language (the worker) to report it, but what does C/C++ do?
It finds some non-intuitive interpretation of "house"[-1] (one which may vary each time the program runs!, and which can't be predicted by the programmer),
then it grabs a series of bits from some place dictated by the wacky interpretation,
it blithely assumes that these bits are meant to be a number (not even a character),
it multiplies that practically-random number by 37, and
then reports the result, all without any hint of a problem.
[Based on an example by M. Felleisen] In a world where programs control credit-card databases, car brakes, my personal finances, airplanes, and x-ray machines, it is criminal negligence to use a language with the flaws of C/C++. Even for games, browsers, and spreadsheets, the use of C/C++ needlessly helps inflict buggy software on the everyday user.
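As a minimal sketch of how readily such code is accepted, the program below compiles without complaint on typical compilers at default warning levels, even though its result is undefined and can change from run to run:
#include <stdio.h>

int main(void)
{
    /* "house"[-1] reads one byte before the string literal: undefined
       behavior, yet it is usually accepted silently. */
    int n = "house"[-1] * 37;
    printf("%d\n", n);   /* prints some practically-random number */
    return 0;
}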
(See also: a blog post by one Alex Gaynor, Modern C++ Won't Save Us.)
This is only one example of how C and C++ get some of the basics wrong. Even the authors of the definitive C++ Annotated Reference Manual (“ARM”) confess that there are problems with the basics (for example, “the C array concept is weak and beyond repair” [pg 212]). I highly recommend C++?? : A Critique of C++ for a detailed exposition of flaws (major and minor) of both C and C++.
For a technical critique of C/C++ from a systems/compiler perspective (about the inherent danger of "undefined behavior" and how it arises surprisingly often even in innocuous C/C++ programs), see this excellent series of blog posts.
A Bad Choice For Students; An Alternative
As a teacher who has tried teaching it, I find C/C++ is also a particularly poor choice of a first language to learn.
Understanding what C or C++ programs do requires additional, reasonably detailed knowledge of how the computer's memory system works (e.g. heap vs. stack memory allocation; word alignment). This, by definition, is low-level; high-level languages (e.g. Mathematica, Java, Scheme, Python) let you focus on computing an answer rather than on details of how the language might implement your program. (C was never intended to be a high-level language, but rather a low-level language with some high-level features on top of it. Such a language has its place, but not as a general-purpose language.)
I'm not saying low-level programming is bad. But when learning how to program, the important thought process should be on how to take a problem-description to code, and not on how the machine stores bits. Low-level programming is very important for programmers who write device drivers, and for compiler writers, etc. But these applications account for a very small portion of all programs written. Let beginning programmers learn the fundamentals, and then those particular students who need it can take a later course in low-level programming.
Unlike some languages, C and C++ are extremely permissive about what is a legal program. This flexibility might be nice for professionals, but for beginners it just means that typos tend to cause mysterious behavior, rather than signalling errors. In my teaching experience, I've often seen students baffled for hours, because they accidentally used a comma somewhere, rather than a semicolon. They often flail for hours, randomly adding or removing keywords they've heard of, like static or public or &, and seeing if that happens to solve their problem. This sort of flailing doesn't help anybody learn, and it's the result of a language which assumes the programmer doesn't ever make mistakes or need help.
A language for students should flag advanced or ambiguous constructs as probable typos. For instance, it's not obvious that in i = v[i++], the final value of i is undefined [C++ ARM, p.46]. It's not difficult for a language to warn you if you write this, but no C++ compilers choose to.
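A minimal sketch of that construct: in C the assignment below is undefined behavior (the side effect of i++ and the store to i are unsequenced), and at default warning levels many compilers say nothing about it (GCC, for example, only flags it under -Wall / -Wsequence-point):
#include <stdio.h>

int main(void)
{
    int v[4] = {10, 20, 30, 40};
    int i = 1;
    i = v[i++];              /* undefined: the final value of i cannot be relied upon */
    printf("i = %d\n", i);
    return 0;
}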
Programming is a difficult task, learned over months and years. Object-oriented programming (the “++” part of “C++”) is a more advanced topic which is important for larger programs, but is best taught after the fundamentals have been learned.
In Mathematica, two billion plus two billion is four billion. In Java, it's defined to always be -293 million (approx). In C/C++, it's defined to be whatever answer gets returned, and will vary from machine to machine.
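A minimal sketch of that arithmetic, assuming the common 32-bit int: signed overflow is undefined in C/C++, so whatever this prints (usually a large negative number produced by wraparound) is not guaranteed by the language at all:
#include <stdio.h>

int main(void)
{
    int a = 2000000000;      /* two billion */
    int b = 2000000000;
    printf("%d\n", a + b);   /* not four billion: signed 32-bit overflow, undefined */
    return 0;
}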
Similarly, an example from the Java Language Specification p. 308: “it is not correct that 4.0*x*0.5 [is the same as] 2.0*x; while roundoff happens not to be an issue here, there are large values of x for which the first expression produces infinity (because of overflow) but the second expression produces a finite result.” (Again, the Java spec at least defines what the answer should be in all cases, unlike C where this is left to vary between platforms.)
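The same effect can be shown in C, as a minimal sketch assuming a typical platform where double is IEEE 754; x is chosen so that 4.0*x overflows but 2.0*x does not:
#include <stdio.h>

int main(void)
{
    double x = 8e307;
    printf("4.0*x*0.5 = %g\n", 4.0 * x * 0.5);   /* inf: 4.0*x overflows first */
    printf("2.0*x     = %g\n", 2.0 * x);         /* 1.6e+308, a finite result */
    return 0;
}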
The point is not that there are good reasons why some languages choose (unlike Mathematica or Scheme) to use imperfect arithmetic, but rather that when teaching a student how to decompose a problem into functions and how to program effectively, it's purely a digression to have to talk about numeric issues stemming from the language's choice of internal representation of numbers. This approach encourages the view that programming is a low-level activity, contradicting 60 years of working towards higher-level languages.
C++ is a large language, with many features, and requiring many statements in beginner programs whose meaning is inscrutable to the beginner.¹ (C++ has 68 operators, with 18 levels of precedence²; compare to Scheme, which has no levels of precedence, no needless distinction between function and operators, instead using parentheses consistently to mean “call a function”. Learning about all these levels has nada to do with the problem which the program is trying to solve.) The C++ standardization committee itself admits [X3J16 92] “C++ is already too large and complicated for our taste.” (Compare to Scheme, which has zero operators — everything is a function, and beginners don't waste time wondering if a certain construct is allowed in a certain context, or get surprised by precedence rules.)
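As a small sketch of the kind of precedence surprise meant here (the names flags and MASK are made up for illustration): == binds more tightly than &, so the test below is parsed as flags & (MASK == MASK) and quietly checks the wrong thing:
#include <stdio.h>

#define MASK 0x04

int main(void)
{
    int flags = 0x04;
    if (flags & MASK == MASK)          /* parsed as flags & (MASK == MASK), i.e. flags & 1 */
        printf("bit looks set\n");
    else
        printf("bit looks clear\n");   /* this branch runs, although the bit is set */
    return 0;
}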
For an alternative, check out the first few slides of a talk showing how to teach AP CS with minimal C++. (The third slide, “ketchup and caviar”, is a gem!) You can get a free curriculum (which is what the Rice University intro curriculum is based on).
So why is C++ so prevalent?
Given these known flaws with C/C++, why is there the popular misconception — among too many programmers and public alike — that C++ is a good language? I genuinely am at a loss to explain it. But here's my suspicion: When C/C++ programmers, used to walking the tightrope without a net, see that a language like Java or Scheme is doing extra work (verifying that any additions really are given numbers instead of strings, making sure array indices are legal, etc.), their reaction is “ugh, the computer is doing so much extra work, my programs will run too slow!” Indeed, benchmark programs do run faster in C or C++.
But there are a number of things to keep in mind: It is well-documented that development time is much longer in C/C++, since bugs creep in more easily. Hence, cost is also higher for C/C++ programs. (Many C/C++ projects have never been completed because of obscure memory bugs.) I'd rather have a slower, correct program than one which finds a wrong answer more quickly :-).
Or even, how important is it to have fast programs? I don't know about you, but when I think about it, most of my wait-time behind the computer is due to my slow typing, or thinking, or waiting for info to download. I've spent much less time waiting for a calculation to finish than I have waiting for my computer to re-boot, or re-typing data which was lost because of a crash. (At the current moment, my netscape is unusable, complaining about pointer-based errors “invalid Pixmap” and “invalid GC parameter”. I'll have to try re-installing. Grrr.)
This is not to deny that some applications require high performance — voice recognition, drafting, visualization of real-time CAT scan data, modeling star evolution or wind tunnels. Yes, C/C++ can sometimes give that performance better than other languages. And expert programmers using C/C++³ for those situations is fine. Indeed, taking prototype code and compiling or re-implementing it for efficiency is one of the prime goals of computer scientists. But such programming (and intensive debugging) is not the best place for the effort of an astronomer or medical researcher. Myself, I rarely or never run those types of programs; most of my time waiting on the computer is waiting for a page to download, not a slow program.
After talking repeatedly with people who tout C++'s run-time efficiency while dismissing its lack of safety, I've seen that they often have a couple of other attitudes:
First, that bugs and crashes are an acceptable or inevitable part of computers. This is an outright lie, and it is foisted off onto the public, who feel forced (for compatibility reasons) to buy from only a few major software firms. The public becomes resigned to poorly-written products and crashes, vindicating the initial attitude.
Second, these people exhibit a form of programmer machismo: “Other people might need the computer to make safety checks as their program runs, but not me! I'm smarter and better than all those thousands of other (more experienced) programmers who've shipped bugs in their products.”
Writing large software systems bug-free is still a task the industry is learning. But having casual programmers learn C or C++, instead of a high-level language, is not the answer!
Another example
For a much more detailed argument on the shortcomings of C and C++, see Ian Joyner's C++?? : A Critique of C++, which includes examples of both flaws inherited from C and flaws introduced in C++. For example, he correctly points out that constructs like:
// s1, s2 are char*
// (intended as strings, not ptrs to an individual char).
while (*s1++ = *s2++);
might look optimal to C programmers, but are the antithesis of efficiency. Such constructs preclude compiler optimisation for processors with specific string handling instructions. A simple assignment is better for strings, as it will allow the compiler to generate optimal code for different target platforms. If the target processor does not have string instructions, then the compiler should be responsible for generating the above loop code, rather than requiring the programmer to write such low level constructs. The above loop construct for string copying is contrary to safety, as there is no check that the destination does not overflow, again an undetected inconsistency which could lead to obscure failures. The above code also makes explicit the underlying C implementation of strings, that are null terminated. Such examples show why C cannot be regarded as a high level language, but rather as a high level assembler.
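The safety point can be made concrete with a minimal sketch, assuming the idiom above is wrapped in a routine (here called copy, a name made up for illustration): nothing in the loop knows how large the destination is, so a six-byte buffer is silently overrun by a longer source string, which is undefined behavior with no diagnostic:
#include <stdio.h>

static void copy(char *s1, const char *s2)
{
    while ((*s1++ = *s2++))      /* stops only at the terminating '\0' */
        ;
}

int main(void)
{
    char dst[6];
    copy(dst, "a string far longer than six bytes");   /* deliberately overruns dst: undefined behavior */
    printf("%s\n", dst);
    return 0;
}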
You can certainly find supporters of C++. But they tend to either misunderstand issues, or have a more relaxed attitude towards unnecessary bugs in commercial products. For instance, choosing a random page from the above link, we find the assertion:
ANSI C makes type safety optional. C++ makes it mandatory. In C++ it is very difficult (not impossible) to violate the type system.
Excuse me? Type casting — which theoretically should be avoided, but is found all over C++ code (under the name of expediency: devotion to efficiency at the expense of correctness) — type casting annihilates any pretense of type safety:
class Party { /* ...class details... */ };
class Trouble { /* ...class details... */ };
Party *invitation = (Party*) (new Trouble());
(a full example)
That wasn't so difficult after all: one routine assignment with a cast. Now C++ thinks that invitation points to a Party, when in fact it points to a Trouble. As soon as the programmer tries using the variable invitation, an error will occur. If they're lucky, the program will crash; if unlucky they'll proceed without knowing they are working with garbage. (This example is in no way atypical C/C++ code.) The above paper, after erroneously claiming C++ is “mostly” type-safe, does acknowledge that lack of type-safety is an anathema:
...type errors in C are often the causes of strange bugs that take weeks or months to find, and that exhibit transient and misleading behavior. They often foul the stack or heap and cause eventual failure several million instructions after the precipitating event. Such bugs are the hallmark of poor quality software.
Why use a language which admits the possibility of such “poor quality software”? C and C++ demonstrate that when a language allows poor practices, thousands of programmers will embrace such practices.
The above citation also asserts:
Why did C succeed? Because it cut through all the crap. Instead of fighting for “purity”, the authors of C created a language that could be used. They never contended that it was perfect or pure,
If by succeed, you mean “many programs were written in it, many had bugs which required a lot of effort before the program was released, and still contained significant flaws”, then sure. More insidiously, C's prevalence has been responsible for the culture of thinking that bugs are acceptable. Already mentioned are the additional costs in increased production, failed projects, and lost work. There are more dramatic examples; one that touched me personally was the internet worm of '88. It stopped all work in engineering departments around the country, since most mainframes were turned off for a couple days, while experts from around the country gathered in Berkeley to find the problem. The opening which the hacker exploited? A buffer overflow in finger, a core program run on most all UNIX computers. While this particular bug is from 1988, try right now a web search for “(buffer OR stack) overflow bug”; I just got 72,000 hits on pages discussing specific bugs of this one sort (2001.jan) (including serious security breaches in programs like Microsoft Messenger and AOL Instant Messenger, used by millions). All stemming from C's low-level “feature” of allowing arbitrary memory access w/o security checks. And there's just no reason this need be.
Mozilla guidelines to assure your C++ is portable. Note that one of Mozilla's software architects says:
[Abelson and Sussman] is absolutely the best book on the topic I've ever seen. By the time you make it halfway through this book, you will have a very firm grasp on what object oriented programming is, because that's what this book is about — programming. This book uses Scheme as its instructional language, but please don't let that put you off. Because this book teaches you programming, not a particular language, and that's the point that so many mediocre programmers manage to get through school without understanding — that 'object oriented' is a style of programming, not a property of some particular language. This book is about problem solving techniques, not about figuring out where the semicolons go.
Related article: High Tech Missionaries of Sloppiness, pointing out that “Being first to market with new products is exalted as the highest goal [in Silicon Valley], and companies fall back on huge technical support and customer service staffs to cope with their many errors of commission and omission.” They go on to argue that “someday soon, the computer industry of some foreign country that embraces [quality rather than speed-to-market] will do to its American competitors exactly what Japanese car makers did to Detroit.”
¹ Java, while much better than C++, shares this same weakness: the smallest Java program requires about 12 keywords, each replete with meaning; a beginner must be told “put these words in your program in just this right order, else it won't work”. I've seen many students needlessly frustrated because it takes 30min to figure out that their non-working program resulted from only inscribing eleven of the dozen necessary arcane glyphs. They may understand conceptually exactly what they want to do, but the arbitrary details of excessive syntax take out all the interest. (Some studies suggest that the prevalent teaching mode — encouraging arbitrary tinkering with little direction or meaning, just trying to get it to work — is one reason for the pronounced gender bias seen in the field of computer science.)
Any teacher knows not to distract from a topic by introducing advanced details to a beginner. Common sense? You wouldn't know it from all the people who want to teach intro programming, but then use Java to do so. (back)
² For comparison, Java has 46 operators with 15 levels of precedence. (back)
³ Indeed, any professional programmer who uses C++ will tell you that they use a disciplined subset of it, since without self-imposed guidelines it's easy to hang yourself with some of the language's “features”. You have to wonder, when style-guides for major, experienced projects include many rules of the form “don't use feature X of the language”; it indicates that the community has learned what language features are more harmful than helpful. (back)
Ian's Home Page. Please let me know of any suggestions. Last modified 2019.Apr.25.