Turbo Pascal back in the day was remarkably fast, one of the fastest compiled languages I used before we had incremental compilers and background incremental compilation. C/C++ compilers have mostly been a lot slower than those old Turbo Pascal implementations. MS QuickC 1.1 was a notable exception in the C compiler space: it could do a one-pass, in-memory compilation of most C code. Modern frameworks feel way slower even with scripting languages that require no compile step.
Come to think of it, my 486 from the early 90s, when running DOS, was one of the fastest computers I have used in terms of bootup and application load times. I would type the name of a program, press Enter, and the program was up and ready to use. The systems I had before that took a lot longer to launch programs, and anything with Windows has always felt an order of magnitude slower. Modern tiny Linux distros can match that speed, but only with a very slimmed-down system minus any GUI. The MacBook Pro with M1 Pro did feel a lot faster than previous Macs or Windows machines in application launch times and general GUI responsiveness, but still no match for those DOS systems.
My experience as well. Apparently the Turbo Pascal developer implemented it as a one-pass compiler that generated machine code while parsing. That dramatically reduces memory usage and increases speed, while reducing the opportunity to optimize the machine code at a higher level.
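To make the one-pass idea concrete, here is a minimal sketch (my own illustration, not Borland's actual design): a recursive-descent parser for arithmetic that emits instructions for a toy stack machine the moment each construct is recognized, so no syntax tree is ever built or stored.

```javascript
// One-pass compilation sketch: emit target code during parsing, no AST.
// The "target" here is a hypothetical stack machine, purely illustrative.

function compileExpr(src) {
  const tokens = src.match(/\d+|[+*()]/g);
  let pos = 0;
  const code = [];                          // instructions emitted so far

  const peek = () => tokens[pos];
  const next = () => tokens[pos++];

  function factor() {                       // factor := number | '(' expr ')'
    if (peek() === "(") { next(); expr(); next(); /* consume ')' */ }
    else code.push(["PUSH", Number(next())]);  // emit immediately on recognition
  }
  function term() {                         // term := factor ('*' factor)*
    factor();
    while (peek() === "*") { next(); factor(); code.push(["MUL"]); }
  }
  function expr() {                         // expr := term ('+' term)*
    term();
    while (peek() === "+") { next(); term(); code.push(["ADD"]); }
  }

  expr();
  return code;                              // ready to run; no tree was built
}

// Tiny interpreter standing in for the CPU executing the emitted code.
function run(code) {
  const stack = [];
  for (const [op, arg] of code) {
    if (op === "PUSH") stack.push(arg);
    else if (op === "MUL") stack.push(stack.pop() * stack.pop());
    else if (op === "ADD") stack.push(stack.pop() + stack.pop());
  }
  return stack.pop();
}

run(compileExpr("2+3*(4+1)"));   // 17
```

The trade-off mentioned above falls straight out of this structure: each instruction is emitted with only local context, so there is no whole-program view left around to optimize against.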
Back in 2005, I bought a car in India and tried to pay part of the purchase price with my credit card. The dealer wanted me to add a surcharge. When I called my card company to complain about this, the issuer's call center employee told me to just pay the surcharge :-(
I used to live in New Delhi, India. In the past there were frequent power outages, and as a result almost everyone who could afford it got either a generator or a battery-based power backup solution. Not everyone can afford these, but whoever can has a backup setup in place.
A similar situation exists with the water supply. What started out as 24-hour running water back in the 1970s has ended up as low-pressure supply for less than an hour, once or twice a day. As a result everyone has built additional storage tanks and added inline pumps to suck water in when the supply is present. The inline pumps are illegal, but everyone has one.
Getting back to the topic: the Indian government also frequently shuts down mobile and broadband internet to control communal riots and anti-government protests.
Reminds me of a case where I was trying to help debug some legacy code. I had a simple question: what code gets executed when this method is triggered via a network request? The final answer I got from the developer maintaining that code was: "I have been working on this for the last 9 months and I have not yet been able to find the entry point from where the requests are handled."
I would hope that, in the future, some ChatGPT-like tool will be able to answer basic questions like this and help a new developer become productive much sooner than today.
Then imagine asking that tool to summarize all the business logic in the project, or asking which sections of the code need investigation to analyze a bug and come up with a solution.
In the end, I think of these AI tools as something that can function like another junior, or even slightly experienced, developer on the team.
I have been in enough debugging sessions to know that it is easy to separate those who are bad from those who have the basics right. But trying to tell whether a candidate is good at advanced debugging is a futile exercise. You need to be in a few intense sessions with such people to know who is good or bad at these things.
I have only been able to figure this out about people after working with them for 6-12 months. Which is why it helps to keep references to such people and have them on your teams in the future ;-)
And what if the firewall blocks ICMP? What if the DNS server is internal to the network and is returning a stale IP? There are way too many paths down this rabbit hole.
This is exactly my point. It's even worse in a "school test" situation, because who knows what contrived scenario the instructor has invented depending on what their focus is (e.g. DNS vs physical network vs routing vs Apache config).
Maybe there are multiple DNS servers for the zone returning different IPs; maybe one or both of them is even in a split-zone configuration, so it returns a different IP depending on whether you're internal or external. Maybe the client has manual DNS configured, or a hosts entry that's wrong.
Each of these problems would have several more layers of troubleshooting steps and branching, and it's not even a complete list -- and this is only if the problem is DNS-centric! There are hundreds of other branches for each of the other problem categories.
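A couple of those DNS branches can at least be mechanized. Here is a hedged sketch of the kind of cross-check they imply: compare what several sources claim a hostname resolves to and flag disagreements. The inputs are plain data (hosts-file text, per-server answers) so the logic is deterministic; in real troubleshooting you'd feed it the output of `dig @server host` and the contents of /etc/hosts. All names here are illustrative.

```javascript
// Return the IP a hosts-file entry would force for `hostname`, or null.
function hostsOverride(hostsText, hostname) {
  for (const line of hostsText.split("\n")) {
    const fields = line.replace(/#.*/, "").trim().split(/\s+/);
    if (fields.length >= 2 && fields.slice(1).includes(hostname)) {
      return fields[0];
    }
  }
  return null;
}

// dnsAnswers: { serverLabel: ip } as answered by each resolver queried.
function diagnose(hostname, hostsText, dnsAnswers) {
  const findings = [];

  const override = hostsOverride(hostsText, hostname);
  if (override) findings.push(`hosts file overrides DNS with ${override}`);

  const ips = new Set(Object.values(dnsAnswers));
  if (ips.size > 1) {
    findings.push("DNS servers disagree (stale record or split-zone?)");
  }
  return findings;
}

diagnose("app.example.com",
  "127.0.0.1 localhost\n10.0.0.5 app.example.com  # pinned ages ago",
  { internal: "10.0.0.5", external: "203.0.113.7" });
// flags both the hosts override and the server disagreement
```

Of course this only covers two branches of one category; the point of the thread stands that the full tree is enormous.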
I mean, if it is a test, you obviously test things you taught them first, no?
Of course in reality there can be more, weirder things, especially if you are coming into an unknown network. But we are talking about an educational context here; it would not make a lot of sense to let your students run into new, unknown issues on a test unless your goal is not to educate.
One of the problems with debugging is that everyone seems to approach it differently, with some approaches working better than others in some situations. Which is why a group debugging session works great when you have a tricky problem that one person alone is not solving.
We can definitely test for basic debugging and troubleshooting skills. But I haven't found a way to consistently evaluate people who are capable of identifying and solving complex problems. These days a lot of this comes down to framework-level experience. With the proliferation of frameworks and tools in modern apps, it is impossible to find someone who can solve problems involving all of them, so in a big team you want a variety of such experience to cover a wider base.
Having said that, I have been in many situations where I had to join a debugging session involving technologies or programming languages of which I had zero prior knowledge, and still moved things toward a solution by asking what at times seemed like basic questions that helped others come up with the answer.
When push comes to shove, go down to the most basic of debugging tools: print output to a trace or console. I have had to resort to some version of temporary print statements in my code to get through debugging. Along the way I have found many situations where those print statements, or their logging/tracing equivalents, introduced changes that altered the program's behavior.
I remember a situation in 2019 where the output of console.log would not match reality while debugging a React app with Chrome's debugger. I had to resort to making a copy of the variables to get console.log to report the truth.
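For anyone who hasn't hit this: Chrome's console logs objects by reference, so expanding a logged object later in DevTools can show the value *after* subsequent mutations, not the value at log time. A small sketch of the gotcha and the copy workaround (exact behavior varies by browser and version):

```javascript
const state = { items: [] };

console.log("live:", state);    // DevTools keeps a reference; expanding this
                                // entry later may show the mutated object

const snapshot = JSON.parse(JSON.stringify(state));  // deep copy
                                // (structuredClone(state) in newer runtimes)
console.log("snapshot:", snapshot);                  // pinned at log time

state.items.push("added after logging");
// Expanding the first log in Chrome may now show the pushed item;
// the snapshot still shows the empty array.
```

Primitive values don't have this problem, which is why the bug only bites when you log objects or arrays that keep changing, exactly the case in a React app with mutable state lying around.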
When it comes to web development, like others have said, it is easy to debug modern web apps with the help of source maps. We have been doing some version of that for a long time now; I remember using source-level debugging with CodeView back in the late 80s/early 90s.
I did a 7-day fast, and I mostly ended up losing fat. While I did no scientific measurement, I was able to exercise just the same after the 7-day fast, and my belly fat came down big time.
Maybe body builders lose lean body mass when they fast, but it does not seem to hold true for the average human.
How would you know, without the "expensive" densitometry scan?
My anecdotal experience: I lost weight via alternate day fasting and did indeed shell out the 120 euro that two scans cost me. I lost mainly fat, my lean body mass after 6 months was down 4 pct, within error range I'd assume. I worked out normally through the entire period and even did things like a 200km bike trip during a 72hr fast. (I did a few of those to compensate for not fasting on holidays). Great experience, never felt better tbh.
One thing I think I learned was the exact power threshold that allowed me to keep burning fat. It's hard to describe the feeling, but after doing this for months (over a year by now) I think I know exactly when I can still push and continue on fat, and when I need to slow down to prevent LBM loss. Doing sprints, short intervals, etc. would likely cause me to break down protein in the body and indeed be counterproductive, but more research is probably needed here, and individuals will likely have different thresholds.
If you can't get a DEXA scan there is an equation published in December of 2021 which approximates it well:
The new equation [%BFNew = 6.083 + (0.143 × SSnew) - (12.058 × sex) - (0.150 × age) - (0.233 × body mass index) + (0.256 × waist) + (0.162 × sex × age)] explained a significant proportion of variance in %BF5C (R2 = 0.775, SEE = 4.0%). Predictors included sum of skinfolds (SSnew, midaxillary, triceps, and thigh) and waist circumference. The new equation cross-validated well against %BF5C when compared with other existing equations, producing a large intraclass correlation coefficient (0.90), small mean bias and limits of agreement (0.4% ± 8.6%), and small measures of error (SEE = 2.5%).
Generalized Equations for Predicting Percent Body Fat from Anthropometric Measures Using a Criterion Five-Compartment Model
Zackary S Cicone, Brett S Nickerson, Youn-Jeng Choi, Clifton J Holmes, Bjoern Hornikel, Michael V Fedewa, Michael R Esco
PMID: 34310492 PMCID: PMC8785250 (available on 2022-12-01) DOI: 10.1249/MSS.0000000000002754
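The published equation above, transcribed as a function so you can plug your own measurements in. The variable coding is my assumption (commonly sex: male = 1, female = 0; skinfolds in mm; waist circumference in cm); check the paper before relying on it.

```javascript
// %BFNew from the Cicone et al. equation quoted above.
// Input coding is assumed, not verified against the paper.
function percentBodyFat({ sumSkinfolds, sex, age, bmi, waist }) {
  return 6.083
       + 0.143 * sumSkinfolds   // midaxillary + triceps + thigh skinfolds, mm (assumed)
       - 12.058 * sex           // assumed: male = 1, female = 0
       - 0.150 * age            // years
       - 0.233 * bmi            // kg/m^2
       + 0.256 * waist          // waist circumference, cm (assumed)
       + 0.162 * sex * age;
}

percentBodyFat({ sumSkinfolds: 45, sex: 1, age: 30, bmi: 24, waist: 85 });
// ≈ 17% for this made-up example
```

Note the reported SEE of roughly 2.5-4%: a single number out of this is a ballpark, not a DEXA replacement, which is the whole caveat of the thread.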
>Maybe body builders lose lean body mass when they fast
Well, highly fit people certainly lose lean body mass when they fast, which you're probably not (even if you're fairly fit)
And bodybuilders DON'T lose fat when they fast, because they take supraphysiological doses of hormones (steroids) that prevent the body from losing muscle, and they usually eat protein throughout their fast too. So not a good example.
Their whole argument was around bodybuilders not losing fat when fasting, so if you're right, this is just irrelevant. No one expects people to lose fat when they're not calorically restricted.