1
00:00:05,580 --> 00:00:20,180
[MUSIC PLAYING]
2
00:00:20,180 --> 00:00:36,640
PROFESSOR: Last time, we took a look at an explicit control evaluator for Lisp, and that bridged the gap between these high-level languages--Lisp and the query language and all of that stuff--and a conventional register machine.
3
00:00:36,640 --> 00:00:55,340
And in fact, you can think of the explicit control evaluator either as, say, the code for a Lisp interpreter if you wanted to implement it in the assembly language of some conventional register transfer machine, or, if you like, you can think of it as the microcode of some machine that's going to be specially designed to run Lisp.
4
00:00:55,340 --> 00:01:08,230
In either case, what we're doing is we're taking a machine that speaks some low-level language, and we're raising the machine to a high-level language like Lisp by writing an interpreter.
5
00:01:08,230 --> 00:01:23,910
So for instance, here, conceptually, is a special purpose machine for computing factorials.
6
00:01:23,910 --> 00:01:29,000
It takes in five and puts out 120.
7
00:01:29,000 --> 00:01:42,410
And what this special purpose machine actually is, is a Lisp interpreter that has configured itself to run factorials, because you feed into it a description of the factorial machine.
8
00:01:42,410 --> 00:01:43,610
So that's what an interpreter is.
9
00:01:43,610 --> 00:01:50,120
It configures itself to emulate a machine whose description you read in.
10
00:01:50,120 --> 00:01:52,110
Now, inside the Lisp interpreter, what's that?
11
00:01:52,110 --> 00:02:03,410
Well, that might be your general register language interpreter that configures itself to behave like a Lisp interpreter, because you put in a whole bunch of instructions in register language.
12
00:02:03,410 --> 00:02:07,070
This is the explicit control evaluator.
13
00:02:07,070 --> 00:02:12,780
And then it also has some sort of library, a library of primitive operators and Lisp operations and all sorts of things like that.
14
00:02:12,780 --> 00:02:17,350
That's the general strategy of interpretation.
15
00:02:17,350 --> 00:02:25,430
And the point is, what we're doing is we're writing an interpreter to raise the machine to the level of the programs that we want to write.
16
00:02:25,430 --> 00:02:29,030
Well, there's another strategy, a different one, which is compilation.
17
00:02:29,030 --> 00:02:31,090
Compilation's a little bit different.
18
00:02:31,090 --> 00:02:47,870
Here we might have produced a special purpose machine for computing factorials, starting with some sort of machine that speaks register language, except we're going to use a different strategy.
19
00:02:47,870 --> 00:02:51,680
We take our factorial program.
20
00:02:51,680 --> 00:02:53,780
We use that as the source code input to a compiler.
21
00:02:53,780 --> 00:02:59,926
What the compiler will do is translate that factorial program into some register machine language.
22
00:02:59,926 --> 00:03:06,760
And this will now be not the explicit control evaluator for Lisp, this will be some register language for computing factorials.
23
00:03:06,760 --> 00:03:10,460
So this is the translation of that.
24
00:03:10,460 --> 00:03:19,970
That will go into some sort of loader which will combine this code with code selected from the library to do things like primitive multiplication.
25
00:03:19,970 --> 00:03:28,320
And then we'll produce a load module which configures the register language machine to be a special purpose factorial machine.
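For concreteness, the factorial program being fed into the compiler could be the usual recursive definition in Lisp (a sketch for illustration; the lecture does not show the source text itself):

(define (fact n)
  (if (= n 1)
      1
      (* n (fact (- n 1)))))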
26
00:03:28,320 --> 00:03:29,905
So that's a different strategy.
27
00:03:29,905 --> 00:03:35,360
In interpretation, we're raising the machine to the level of our language, like Lisp.
28
00:03:35,360 --> 00:03:42,040
In compilation, we're taking our program and lowering it to the language that's spoken by the machine.
29
00:03:42,040 --> 00:03:44,280
Well, how do these two strategies compare?
30
00:03:44,280 --> 00:03:50,140
The compiler can produce code that will execute more efficiently.
31
00:03:52,490 --> 00:04:10,260
The essential reason for that is that if you think about the register operations that are running, the interpreter has to produce register operations which, in principle, are going to be general enough to execute any Lisp procedure.
32
00:04:10,260 --> 00:04:20,209
Whereas the compiler only has to worry about producing a special bunch of register operations for doing the particular Lisp procedure that you've compiled.
33
00:04:20,209 --> 00:04:31,160
Or another way to say that is that the interpreter is a general purpose simulator, so that when you read in a Lisp procedure, it can simulate the program described by that procedure.
34
00:04:31,160 --> 00:04:40,000
So the interpreter is worrying about making a general purpose simulator, whereas the compiler, in effect, is configuring the thing to be the machine that the interpreter would have been simulating.
35
00:04:40,000 --> 00:04:41,340
So the compiler can be faster.
36
00:04:52,830 --> 00:04:59,340
On the other hand, the interpreter is a nicer environment for debugging.
37
00:04:59,340 --> 00:05:02,960
And the reason for that is that we've got the source code actually there.
38
00:05:02,960 --> 00:05:03,740
We're interpreting it.
39
00:05:03,740 --> 00:05:06,010
That's what we're working with.
40
00:05:06,010 --> 00:05:07,880
And we also have the library around.
41
00:05:07,880 --> 00:05:11,140
See, with the interpreter, the library sitting there is part of the interpreter.
42
00:05:11,140 --> 00:05:14,830
The compiler only pulls out from the library what it needs to run the program.
43
00:05:14,830 --> 00:05:29,670
So if you're in the middle of debugging, and you might like to write a little extra program to examine some run time data structure or to produce some computation that you didn't think of when you wrote the program, the interpreter can do that perfectly well, whereas the compiler can't.
44
00:05:29,670 --> 00:05:31,850
So there are sort of dual advantages.
45
00:05:31,850 --> 00:05:34,720
The compiler will produce code that executes faster.
46
00:05:34,720 --> 00:05:39,030
The interpreter is a better environment for debugging.
47
00:05:39,030 --> 00:05:46,930
And most Lisp systems end up having both, end up being configured so you have an interpreter that you use when you're developing your code.
48
00:05:46,930 --> 00:05:49,060
Then you can speed it up by compiling.
49
00:05:49,060 --> 00:05:54,810
And very often, you can arrange that compiled code and interpreted code can call each other.
50
00:05:54,810 --> 00:05:55,700
We'll see how to do that.
51
00:05:55,700 --> 00:05:56,950
That's not hard.
52
00:06:01,040 --> 00:06:14,320
In fact, in the compiler we're going to make, the way we'll arrange for compiled code and interpreted code to call each other is that we'll have the compiler use exactly the same register conventions as the interpreter.
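For reference, the register conventions being shared are the ones from the book's explicit-control evaluator (listed here as an aside, not stated in the lecture itself):

;; registers of the explicit-control evaluator
;;   exp, unev                       -- expression bookkeeping (interpreter only)
;;   env, val, proc, argl, continue  -- also used by compiled code, so that
;;                                      interpreted and compiled procedures can
;;                                      call each other through the same
;;                                      apply conventions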
53
00:06:18,680 --> 00:06:25,490
Well, the idea of a compiler is very much like the idea of an interpreter or evaluator.
54
00:06:25,490 --> 00:06:27,070
It's the same thing.
55
00:06:27,070 --> 00:06:33,840
See, the evaluator walks over the code and performs some register operations.
56
00:06:33,840 --> 00:06:37,040
That's what we did yesterday.
57
00:06:37,040 --> 00:06:48,890
Well, the compiler essentially would like to walk over the code and produce the register operations that the evaluator would have done were it evaluating the thing.
58
00:06:48,890 --> 00:06:58,330
And that gives us a model for how to implement a zeroth-order compiler, a very bad compiler but essentially a compiler.
59
00:06:58,330 --> 00:07:07,550
A model for doing that is you just take the evaluator, you run it over the code, but instead of executing the actual operations, you just save them away.
60
00:07:07,550 --> 00:07:08,820
And that's your compiled code.
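A minimal sketch of that idea--run the dispatch the evaluator would run, but collect the register operations into a list instead of performing them--might look like this. The names are illustrative; the book's actual compiler also threads a target register and a linkage descriptor through every call:

(define (self-evaluating? exp) (or (number? exp) (string? exp)))
(define (variable? exp) (symbol? exp))

(define (compile-sketch exp)
  ;; Returns a list of register-machine instructions; nothing is executed.
  (cond ((self-evaluating? exp)
         `((assign val (const ,exp))))
        ((variable? exp)
         `((assign val (op lookup-variable-value) (const ,exp) (reg env))))
        (else
         (error "Expression type not handled in this sketch" exp))))

;; (compile-sketch 'f)
;; => ((assign val (op lookup-variable-value) (const f) (reg env)))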
61
00:07:08,820 --> 00:07:10,140
So let me give you an example of that.
62
00:07:15,130 --> 00:07:18,010
Suppose we want to compile the expression f of x.
63
00:07:25,100 --> 00:07:30,170
So let's assume that we've got f of x in the exp register and something in the environment register.
64
00:07:30,170 --> 00:07:31,745
And now imagine starting up the evaluator.
65
00:07:34,560 --> 00:07:38,000
Well, it looks at the expression and it sees that it's an application.
66
00:07:38,000 --> 00:07:44,980
And it branches to a place in the evaluator code we saw called ev-application.
67
00:07:47,230 --> 00:07:48,190
And then it begins.
68
00:07:48,190 --> 00:07:54,410
It stores away the operands in unev, and then it's going to put the operator in exp, and it's going to go recursively evaluate it.
69
00:07:54,410 --> 00:07:56,385
That's the process that we walk through.
70
00:07:56,385 --> 00:08:00,200
And if you start looking at the code, you start seeing some register operations.
71
00:08:00,200 --> 00:08:06,770
You see assign to unev the operands, assign to exp the operator, save the environment, generate that, and so on.
72
00:08:10,310 --> 00:08:20,860
Well, if we look on the overhead here, we can see those operations starting to be produced.
73
00:08:20,860 --> 00:08:24,910
Here's sort of the first real operation that the evaluator would have done.
74
00:08:24,910 --> 00:08:34,740
It pulls the operands out of the exp register and assigns them to unev. And then it assigns something to the expression register, and it saves continue, and it saves env.
75
00:08:34,740 --> 00:08:42,010
And all I'm doing here is writing down the register assignments that the evaluator would have done in executing that code.
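Transcribed roughly, the beginning of that saved-away code would look something like the following, in the register-machine notation of the book's ev-application (the exact ordering on the lecture slide may differ slightly):

(assign unev (op operands) (reg exp))
(assign exp (op operator) (reg exp))
(save continue)
(save env)
(save unev)
(assign continue (label eval-args))
;; ... about 19 operations in all, ending with a jump toward apply-dispatch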
76
00:08:42,010 --> 00:08:44,280
And we can zoom out a little bit.
77
00:08:44,280 --> 00:08:49,430
Altogether, there are about 19 operations there.
78
00:08:49,430 --> 00:08:57,940
And this will be the piece of code up until the point where the evaluator branches off to apply-dispatch.
79
00:08:57,940 --> 00:09:01,450
And in fact, in this compiler, we're not going to worry about apply-dispatch at all.
80
00:09:01,450 --> 00:09:06,160
We're going to have everything--both interpreted code and compiled code--
81
00:09:06,160 --> 00:09:10,240
always apply procedures by going to apply-dispatch.
82
00:09:10,240 --> 00:09:13,970
That will easily allow interpreted code and compiled code to call each other.
83
00:09:18,330 --> 00:09:21,220
Well, in principle, that's all we need to do.
84
00:09:21,220 --> 00:09:22,620
You just run the evaluator.
85
00:09:22,620 --> 00:09:24,320
So the compiler's a lot like the evaluator.
86
00:09:24,320 --> 00:09:29,480
You run it, except it stashes away these operations instead of actually executing them.
87
00:09:29,480 --> 00:09:32,680
Well, that's not quite true.
88
00:09:32,680 --> 00:09:36,370
There's only one little lie in that.
89
00:09:36,370 --> 00:09:40,480
What you have to worry about is if you have a predicate.
90
00:09:40,480 --> 00:09:51,400
If you have some kind of test you want to do, obviously, at the point when you're compiling it, you don't know which branch of a conditional like this you're going to do.
91
00:09:51,400 --> 00:09:55,010
So you can't say which one the evaluator would have done.
92
00:09:55,010 --> 00:09:57,190
So all you do there is very simple.
93
00:09:57,190 --> 00:09:58,985
You compile both branches.
94
00:09:58,985 --> 00:10:02,050
So you compile a structure that looks like this.
95
00:10:02,050 --> 00:10:18,140
That'll compile into something that says: the code for P. And it puts its result in, say, the val register.
96
00:10:18,140 --> 00:10:24,770
So you walk the interpreter over the predicate and make sure that the result would go into the val register.
97
00:10:24,770 --> 00:10:38,670
And then you compile an instruction that says, branch, if val is true, to a place we'll call label one.
98
00:10:44,950 --> 00:11:04,920
Then we will put the code for B--walk the interpreter over B--and then put in an instruction that says, go to the next thing, whatever was supposed to happen after this thing was done.
99
00:11:04,920 --> 00:11:06,900
You put in that instruction.
100
00:11:06,900 --> 00:11:08,280
And here you put label one.
101
00:11:11,521 --> 00:11:25,870
And here you put the code for A. And you put go to next thing.
102
00:11:31,420 --> 00:11:33,090
So that's how you treat a conditional.
103
00:11:33,090 --> 00:11:35,890
You generate a little block like that.
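Schematically, the block generated for a conditional (if p a b) would be something like this, where the angle brackets mark the recursively compiled pieces, the label names are made up, and true? stands for whatever truth test the machine provides:

  <code for p>                     ; leaves its result in val
  (test (op true?) (reg val))
  (branch (label label1))          ; "branch, if val is true, to label one"
  <code for b>                     ; the other branch
  (goto (label after-cond))
label1
  <code for a>
  (goto (label after-cond))
after-cond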
104
00:11:35,890 --> 00:11:42,310
And other than that, this zeroth-order compiler is the same as the evaluator.
105
00:11:42,310 --> 00:11:46,380
It's just stashing away the instructions instead of executing them.
106
00:11:46,380 --> 00:11:50,120
That seems pretty simple, but we've gained something by that.
107
00:11:50,120 --> 00:11:53,630
See, already that's going to be more efficient than the evaluator.
108
00:11:53,630 --> 00:12:04,740
Because, if you watch the evaluator run, it's not only generating the register operations we wrote down, it's also doing things to decide which ones to generate.
109
00:12:04,740 --> 00:12:16,780
So the very first thing it does, say, here for instance, is go do some tests and decide that this is an application, and then branch off to the place that handles applications.
110
00:12:16,780 --> 00:12:25,580
In other words, what the evaluator's doing is simultaneously analyzing the code to see what to do, and running these operations.
111
00:12:25,580 --> 00:12:34,900
And if you run the evaluator a million times, that analysis phase happens a million times, whereas in the compiler, it's happened once, and then you just have the register operations themselves.
112
00:12:39,730 --> 00:12:44,550
Ok, that's a zeroth-order compiler, but it is a wretched, wretched compiler.
113
00:12:44,550 --> 00:12:47,200
It's really dumb.
114
00:12:47,200 --> 00:12:52,040
Let's go back and look at this overhead.
115
00:12:52,040 --> 00:12:56,020
So look at some of the operations this thing is doing.
116
00:12:56,020 --> 00:13:03,710
We're supposedly looking at the operations in interpreting f of x.
117
00:13:03,710 --> 00:13:05,220
Now, look here what it's doing.
118
00:13:05,220 --> 00:13:13,850
For example, here it assigns to exp the operator of fetch of exp.
119
00:13:13,850 --> 00:13:23,310
But see, there's no reason to do that, because the compiler knows that the operator of fetch of exp is f right here.
120
00:13:23,310 --> 00:13:25,850
So there's no reason why this instruction should say that.
121
00:13:25,850 --> 00:13:29,580
It should say, we'll assign to exp, f.
122
00:13:29,580 --> 00:13:32,000
Or in fact, you don't need exp at all.
123
00:13:32,000 --> 00:13:33,670
There's no reason it should have exp at all.
124
00:13:33,670 --> 00:13:35,170
What did exp get used for?
125
00:13:35,170 --> 00:13:48,620
Well, if we come down here, we're going to assign to val, look up the stuff in exp in the environment.
126
00:13:48,620 --> 00:13:58,850
So what we really should do is get rid of the exp register altogether, and just change this instruction to say, assign to val, look up the variable value of the symbol f in the environment.
127
00:14:01,100 --> 00:14:09,150
Similarly, back up here, we don't need unev at all, because we know what the operands of fetch of exp are for this piece of code.
128
00:14:09,150 --> 00:14:10,630
It's the list x.
129
00:14:13,270 --> 00:14:19,660
So in some sense, you don't want unev and exp at all.
130
00:14:19,660 --> 00:14:25,230
See, in some sense, those aren't really registers of the actual machine that's supposed to run.
131
00:14:25,230 --> 00:14:30,760
Those are registers that have to do with arranging the thing that can simulate that machine.
132
00:14:30,760 --> 00:14:39,510
So they're always going to hold expressions which, from the compiler's point of view, are just constants, so can be put right into the code.
133
00:14:39,510 --> 00:14:44,000
So you can forget about all the operations worrying about exp and unev and just use those constants.
134
00:14:44,000 --> 00:14:50,510
Similarly, again, if we go back and look here, there are things like assign to continue, eval-args.
135
00:14:53,890 --> 00:14:55,440
Now, that has nothing to do with anything.
136
00:14:55,440 --> 00:15:06,920
That was just the evaluator keeping track of where it should go next, to evaluate the arguments in some application.
137
00:15:06,920 --> 00:15:15,220
But of course, that's irrelevant to the compiler, because the analysis phase will have already done that.
138
00:15:15,220 --> 00:15:17,680
So this is completely irrelevant.
139
00:15:17,680 --> 00:15:26,120
So a lot of these assignments to continue have nothing to do with where the running machine is supposed to continue in keeping track of its state.
140
00:15:26,120 --> 00:15:30,080
They have to do with where the evaluator analysis should continue, and those are completely irrelevant.
141
00:15:30,080 --> 00:15:31,330
So we can get rid of them.
142
00:15:44,330 --> 00:16:08,540
Ok, well, if we simply do that, make those kinds of optimizations, get rid of worrying about exp and unev, and get rid of these irrelevant register assignments to continue, then we can take this literal code, these roughly 19 instructions that the evaluator would have done, and replace them.
143
00:16:08,540 --> 00:16:09,865
Let's look at the slide.
144
00:16:13,490 --> 00:16:15,180
Replace them by--we get rid of about half of them.
145
00:16:18,370 --> 00:16:25,200
And again, this is just sort of filtering what the evaluator would have done by getting rid of the irrelevant stuff.
146
00:16:25,200 --> 00:16:35,470
And you see, for instance, where the evaluator said, assign val, look up variable value, fetch of exp, here we have put in the constant f.
147
00:16:35,470 --> 00:16:37,020
Here we've put in the constant x.
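In the lecture's register-language notation, that substitution is roughly the following pair of lines (a sketch of the two versions being compared):

;; what the evaluator's general-purpose code says, for any expression:
(assign val (lookup-variable-value (fetch exp) (fetch env)))
;; what the compiler can emit, since it knows the operator is the symbol f:
(assign val (lookup-variable-value 'f (fetch env)))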
148
00:16:39,770 --> 00:16:43,860
So there's a little better compiler.
149
00:16:43,860 --> 00:16:47,930
It's still pretty dumb.
150
00:16:47,930 --> 00:16:50,560
It's still doing a lot of dumb things.
151
00:16:50,560 --> 00:17:03,430
Again, if we look at the slide, at the very beginning here, we see a save of the environment, an assign of something to the val register, and a restore of the environment.
152
00:17:03,430 --> 00:17:05,030
Where'd that come from?
153
00:17:05,030 --> 00:17:11,160
That came from the evaluator back here saying, oh, I'm in the middle of evaluating an application.
154
00:17:11,160 --> 00:17:15,940
So I'm going to recursively call eval dispatch.
155
00:17:15,940 --> 00:17:19,849
So I'd better save the thing I'm going to need later, which is the environment.
156
00:17:19,849 --> 00:17:23,520
This was the result of recursively calling eval dispatch.
157
00:17:23,520 --> 00:17:26,540
It was evaluating the symbol f in that case.
158
00:17:26,540 --> 00:17:31,380
Then it came back from eval dispatch, restored the environment.
159
00:17:31,380 --> 00:17:38,740
But in fact, the actual thing it ended up doing in the evaluation is not going to hurt the environment at all.
160
00:17:38,740 --> 00:17:42,170
So there's no reason to be saving the environment and restoring the environment here.
161
00:17:46,020 --> 00:17:58,090
Similarly, here I'm saving the argument list. That's a piece of the argument evaluation loop, saving the argument list, and here you restore it.
162
00:17:58,090 --> 00:18:04,090
But the actual thing that you ended up doing didn't trash the argument list. So there was no reason to save it.
163
00:18:08,690 --> 00:18:23,180
So another way to say that is that the evaluator has to be maximally pessimistic, because from its point of view, it's just going off to evaluate something.
164
00:18:23,180 --> 00:18:26,200
So it better save what it's going to need later.
165
00:18:26,200 --> 00:18:32,140
But once you've done the analysis, the compiler is in a position to say, well, what actually did I need to save?
166
00:18:32,140 --> 00:18:39,950
It doesn't need to be as careful as the evaluator, because it knows what it actually needs.
167
00:18:39,950 --> 00:18:49,400
Well, in any case, if we do that and eliminate all those redundant saves and restores, then we can get it down to this.
168
00:18:49,400 --> 00:19:00,070
And you see there are only three instructions that we actually need, down from the 11 or so in the previous version, or the 20 or so in the original one.
169
00:19:00,070 --> 00:19:04,870
And that's just saying, of those register operations, which ones did we actually need?
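So the surviving code for f of x might look something like this--a guess at the three operations, using the lecture's nested-expression style and the book's register names, rather than the actual slide:

(assign proc (lookup-variable-value 'f (fetch env)))
(assign argl (cons (lookup-variable-value 'x (fetch env)) '()))
(goto apply-dispatch)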
170
00:19:09,490 --> 00:19:13,450
Let me just summarize that in another way, just to show you a little better picture.
171
00:19:16,010 --> 00:19:20,530
Here's a picture--this is looking at all the saves and restores.
172
00:19:23,770 --> 00:19:38,160
So here's the expression, f of x, and then this traces through, on the bottom here, the various places in the evaluator that were passed when the evaluation happened.
173
00:19:38,160 --> 00:19:40,250
And then here you see arrows.
174
00:19:40,250 --> 00:19:42,320
Arrow down means register saved.
175
00:19:42,320 --> 00:19:46,860
So the first thing that happened is the environment got saved.
176
00:19:46,860 --> 00:19:48,305
And over here, the environment got restored.
177
00:19:52,380 --> 00:19:56,220
So there are all the pairs of stack operations.
178
00:19:56,220 --> 00:20:03,320
Now, if you go ahead and say, well, let's remember that unev, for instance, is a completely useless register.
179
00:20:07,550 --> 00:20:13,020
And if we use the constant structure of the code, well, we don't need to save unev. We don't need unev at all.
180
00:20:16,220 --> 00:20:23,860
And then, depending on how we set up the discipline of calling other things, like apply, we may or may not need to save continue.
181
00:20:27,360 --> 00:20:28,800
That's the first step I did.
182
00:20:28,800 --> 00:20:32,960
And then we can look and see what's actually needed.
183
00:20:32,960 --> 00:20:40,040
See, we didn't really need to save env across evaluating f, because it wouldn't trash it.
184
00:20:40,040 --> 00:21:03,320
So we can take advantage of that, and see that the evaluation of f here doesn't really need to worry about hurting env. And similarly, the evaluation of x here: when the evaluator did that, it said, oh, I'd better preserve the function register around that, because I might need it later.
185
00:21:03,320 --> 00:21:07,140
And I better preserve the argument list.
186
00:21:07,140 --> 00:21:12,730
Whereas the compiler is now in a position to know, well, we didn't really need to do those saves and restores.
187
00:21:12,730 --> 00:21:19,670
So in fact, all of the stack operations done by the evaluator turned out to be unnecessary or overly pessimistic.
188
00:21:19,670 --> 00:21:21,390
And the compiler is in a position to know that.
189
00:21:27,470 --> 00:21:29,980
Well that's the basic idea.
190
00:21:29,980 --> 00:21:40,460
We take the evaluator, we eliminate the things that you don't need, that in some sense have nothing to do with the compiler at all, just the evaluator, and then you see which stack operations are unnecessary.
191
00:21:40,460 --> 00:21:45,130
That's the basic structure of the compiler that's described in the book.
192
00:21:45,130 --> 00:21:51,280
Let me just show you--that example is a little bit too simple.
193
00:21:51,280 --> 00:21:55,765
To see how you actually save a lot, let's look at a little bit more complicated expression.
194
00:21:58,330 --> 00:22:03,542
f of g of x, and 1.
195
00:22:03,542 --> 00:22:06,410
And I'm not going to go through all the code.
196
00:22:06,410 --> 00:22:09,830
There's a fair pile of it.
197
00:22:09,830 --> 00:22:17,270
I think there are something like 16 pairs of register saves and restores as the evaluator walks through that.
198
00:22:17,270 --> 00:22:20,680
Here's a diagram of them.
199
00:22:20,680 --> 00:22:21,060
Let's see.
200
00:22:21,060 --> 00:22:24,210
You see what's going on.
201
00:22:24,210 --> 00:22:26,480
You start out--the evaluator says, oh, I'm about to do an application.
202
00:22:26,480 --> 00:22:28,010
I'll preserve the environment.
203
00:22:28,010 --> 00:22:30,261
I'll restore it here.
204
00:22:30,261 --> 00:22:33,900
Then I'm about to do the first operand.
205
00:22:36,790 --> 00:22:38,970
Here it recursively goes to the evaluator.
206
00:22:38,970 --> 00:22:46,740
The evaluator says, oh, this is an application, I'll save the environment, do the operator of that combination, restore it here.
207
00:22:46,740 --> 00:22:51,720
This restore matches that save. And so on.
208
00:22:51,720 --> 00:22:57,240
There's unev here, which turns out to be completely unnecessary; continue is getting bumped around here.
209
00:22:57,240 --> 00:23:05,330
The function register is getting saved across the first operand, across the operands.
210
00:23:05,330 --> 00:23:06,680
All sorts of things are going on.
211
00:23:06,680 --> 00:23:14,320
But if you say, well, which of those really were the business of the compiler as opposed to the evaluator, you get rid of a whole bunch.
212
00:23:14,320 --> 00:23:34,570
And then on top of that, if you say things like, the evaluation of f doesn't hurt the environment register, or, for simply looking up the symbol x, you don't have to protect the function register against that.
213
00:23:34,570 --> 00:23:37,530
So you come down to just a couple of pairs here.
214
00:23:40,280 --> 00:23:42,160
And still, you can do a little better.
215
00:23:42,160 --> 00:23:44,962
Look what's going on here with the environment register.
216
00:23:44,962 --> 00:23:52,600
The compiler comes along and says, oh, here's a combination.
217
00:23:54,280 --> 00:23:58,580
This compiler, by the way, doesn't know anything about G.
218
00:23:58,580 --> 00:24:15,540
So here it says, I'd better save the environment register, because evaluating G might be some arbitrary piece of code that would trash it, and I'm going to need it later, after this argument, for doing the second argument.
219
00:24:15,540 --> 00:24:22,550
So that's why this one didn't go away, because the compiler made no assumptions about what G would do.
220
00:24:22,550 --> 00:24:27,710
On the other hand, if you look at what the second argument is, that's just looking up one.
221
00:24:27,710 --> 00:24:30,810
That doesn't need this environment register.
222
00:24:30,810 --> 00:24:32,070
So there's no reason to save it.
223
00:24:32,070 --> 00:24:35,020
So in fact, you can get rid of that one, too.
224
00:24:35,020 --> 00:24:45,170
And from this whole pile of register operations, if you simply do a little bit of reasoning like that, you get down to, I think, just two pairs of saves and restores.
225
00:24:45,170 --> 00:24:56,650
And those, in fact, could also go away if you knew something about G.
226
00:24:56,650 --> 00:25:03,310
So again, the general idea is that the reason the compiler can be better is that the interpreter doesn't know what it's about to encounter.
227
00:25:03,310 --> 00:25:07,750
It has to be maximally pessimistic in saving things to protect itself.
228
00:25:07,750 --> 00:25:13,410
The compiler only has to deal with what actually had to be saved.
229
00:25:13,410 --> 00:25:17,920
And there are two reasons that something might not have to be saved.
230
00:25:17,920 --> 00:25:24,210
One is that what you're protecting it against, in fact, didn't trash the register, like it was just a variable look-up.
231
00:25:24,210 --> 00:25:30,800
And the other one is, that the thing that you were saving it for might turn out not to actually need it.
232
00:25:30,800 --> 00:25:38,260
So those are the two basic pieces of knowledge that the compiler can take advantage of in making the code more efficient.
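The book packages exactly those two tests into its preserving mechanism. A stripped-down sketch of the idea (the real version works on instruction sequences that carry their own sets of needed and modified registers):

(define (preserving-sketch reg seq1 seq2 modifies? needs?)
  ;; Wrap seq1 in a save/restore of reg only when both conditions hold:
  ;; seq1 might trash reg, AND seq2 still needs it afterwards.
  (if (and (modifies? seq1 reg) (needs? seq2 reg))
      (append `((save ,reg)) seq1 `((restore ,reg)) seq2)
      (append seq1 seq2)))

;; e.g. with maximally pessimistic predicates,
;; (preserving-sketch 'env seq1 seq2
;;                    (lambda (seq reg) #t)   ; assume reg is trashed
;;                    (lambda (seq reg) #t))  ; assume reg is still needed
;; always emits (save env) ... (restore env), which is what the evaluator does.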
233
00:25:44,570 --> 00:25:45,820
Let's break for questions.
234
00:25:51,280 --> 00:25:56,350
AUDIENCE: You kept saying that the unev register didn't need to be used at all.
235
00:25:56,350 --> 00:25:58,590
Does that mean that you could just map onto a six-register machine?
236
00:25:58,590 --> 00:26:01,860
Or is it that, in this particular example, it didn't need to be used?
237
00:26:01,860 --> 00:26:07,580
PROFESSOR: For the compiler, you could generate code for the six-register machine--five, right?
238
00:26:07,580 --> 00:26:08,930
Because that exp goes away also.
239
00:26:11,750 --> 00:26:17,380
Assuming--yeah, you can get rid of both exp and unev, because, see, those are data structures of the evaluator.
240
00:26:17,380 --> 00:26:21,410
Those are all things that would be constants from the point of view of the compiler.
241
00:26:21,410 --> 00:26:29,330
The only thing is this particular compiler is set up so that interpreted code and compiled code can coexist.
242
00:26:29,330 --> 00:26:39,920
So the way to think about it is, maybe you build a chip which is the evaluator, and what the compiler might do is generate code for that chip.
243
00:26:39,920 --> 00:26:41,550
It just wouldn't use two of the registers.
244
00:26:51,158 --> 00:26:53,326
All right, let's take a break.
245
00:26:53,326 --> 00:27:28,576
[MUSIC PLAYING]
246
00:27:28,576 --> 00:27:32,900
We just looked at what the compiler is supposed to do.
247
00:27:32,900 --> 00:27:38,120
Now let's very briefly look at how this gets accomplished.
248
00:27:38,120 --> 00:27:39,600
And I'm going to give no details.
249
00:27:39,600 --> 00:27:43,440
There's a giant pile of code in the book that gives all the details.
250
00:27:43,440 --> 00:27:49,590
But what I want to do is just show you the essential idea here.