There are enough satellites in Sun-synchronous orbit (roughly 97° inclination) that polar coverage should be pretty good by now, I’d imagine. The gap left by the big guns (GEO and MEO) is more than made up by LEO.
I’m pretty sure that the state of the art right now is firing the pastries on a ballistic arc in hard vacuum and hitting them mid-trajectory with a laser pulse to cook them through.
Pretty much everyone is on 300 mm wafers for everything now, and has been for a while. Are you perhaps reading this as a 300 nm process (which would usually be called 0.3 micron)?
But in the context of what we are talking about, it's still true that nobody in the EU is making cutting-edge CPUs/GPUs/DRAM, and there are no plans to do so either (including at that Infineon fab).
I think a challenge, for me, with typing assembly (unless you’re doing old-school C-style minimally useful types) is that assembly types tend to be both more ad hoc and more transient than types in higher-level languages, because they come from the intersection of the problem domain and the way of expressing the solution, instead of just from the problem domain. In C++ I might have a type for “aircraft velocity in mm/s”, but in assembly I might have that type on one line and then switch to velocity in 2x mm/s on the next line to save a renormalization; or have types for various state flags, but have them pack differently into a word in different places in the code. This is all expressible, but I think it would push me toward more implicit typing with a heavier emphasis on deduction, just to minimize the description of types that exist but are not in themselves interesting.
Just thinking about an aircraft's velocity as a specific type, rather than a vector with three floats, has my mind whirling. I can imagine a lot of terrifying things I wish I didn't think could be added later to that struct in some avionics system. What would you need a type for that for? Am I thinking too high level, where this type might include its own getters and function calls?
Think of types more as physical units that check your calculation. Positions on a chess board and on a checkerboard are both 2D integer vectors, but you might or might not want them to be summable, the same way that 5 liters and 5 grams are both real numbers but should not be summed.
So if your algorithm counts apples and counts pears, those wouldn't both have the type "integer". Far from it. They would have the types "number of apples" and "number of pears".
See the other replies: think physical units. An aircraft velocity is a three-vector, but not every three-vector is an aircraft velocity. There are probably many different aircraft-velocity types, but taking a typical one (NED alignment, mm/s scaling, some particular precision), the type is the set of three-vectors that can, for example, be meaningfully added to each other. It makes sense to add two aircraft velocities; it does not make sense to add an aircraft velocity and a pixel color (another three-vector), so they are observably different types.
Because, due to rotation, the Earth bulges outward at the equator. The actual term is "equatorial bulge".
Because of that equatorial bulge, at the equator you are farther away from Earth's dense center than you are at the poles. And due to the inverse-square law, gravitational force falls off quickly the farther away you are.
Hence: the smallest gravitational force at the surface is on a mountain in the Andes near the equator.
Which Boeing incident? The 737 Max was a correct implementation of bad requirements -- there's no indication of a code-quality problem there. Starliner definitely showed more signs of code issues, but it's not an aircraft.
> Objective-C was created because using standards is contrary to the company culture.
What language would you have suggested for that mission and that era? Self or Smalltalk and give up on performance on 25-MHz-class processors? C or Pascal and give up an excellent object system with dynamic dispatch?
C's a great language in 1985, and a great starting point. But development of UI software is one of those areas where object oriented software really shines. What if we could get all the advantages of C as a procedural language, but graft on top an extremely lightweight object system with a spec of < 20 pages to take advantage of these new 1980s-era developments in software engineering, while keeping 100% of the maturity and performance of the C ecosystem? We could call it Objective-C.
If you read the colorforth source (there are only around 1,000 instructions of assembly; it's not a long read), there's a real sense that much of the design (pretokenization, color tags, etc.) is built around the punchline of a single rep-prefixed instruction being used to linear-search the dictionary. Committing to trivial, hardware-supported data structures constrains the implementation a good deal, but there's magic in self-imposed constraints.
In this context, "slowing" refers to a change in a derivative. Growth is the first derivative of GDP with respect to time; "slowing growth" means that first derivative is decreasing, i.e. the second derivative of GDP with respect to time went negative. That does seem to be the case from mid-2018 in the linked chart.