If you have the means, get a computer science degree at a reasonable school, and don't listen to people telling you it's too late or that ML is your meal ticket. We have two kinds of jobs here in the valley: the glamorous and competitive, and the really challenging and necessary. The latter are more immune to hype cycles and economic downturns. If you do something along the lines of networking/security or cloud infrastructure, and learn something that confuses other people (how to use OAuth2 properly, say), you'll be able to work on the infra side at almost any company. Once you get your foot in the door, you can learn the latest hottest fad on the job, wherever you end up. Infrastructure is the computer industry's version of cleaning out the stables, but it's necessary everywhere, and once you prove yourself and show you're capable of learning on the job, you can move on to other stuff. If you start to enjoy infra work, you can work anywhere. Don't get stuck doing process, though; don't be the compliance guy, for example.
The biggest shortage in silicon valley is that of capable minds and hands. It's on you to make yourself marketable, but once that's done, there's tons of opportunity. I've been working here since 1996. I'm a grey haired old timer now, and I've seen this industry from big companies, startups, boring companies, and fad companies. Get your foot in the door, the first job won't be glorious, but as you demonstrate skill, pay and rank will follow. Don't go to any of the companies staffed by lots of startup bros, because a wrinkle or some greying hair is a disadvantage there, but there are plenty of other places to start.
This, but do your CS degree online, and do an MS rather than an undergrad. Despite what you might think, you need essentially no prior coding experience to enroll in and complete most MS programs (though you may be conditionally admitted and need remedial courses). And any school will do for an MS; there is no such thing as a prestigious master's degree. Anyone hunting academic prestige beyond undergrad just gets a doctorate.
Not sure what a CS degree is supposed to help with if this guy wants to pursue the "scrappy SV tech startup" world.
He shouldn't be wasting time learning from professors who have never done anything real and have lived in an academic insulated bubble for decades.
Instead he should be building and shipping product after product until something sticks. And spending just as much time on marketing as development, and putting a lot of time and energy into hiring out and training a team to scale up to the next stage.
The degree itself is unimportant, but a CS program teaches important skills (which you can also teach yourself), and the degree makes getting interviews a lot easier. Like I said, I've been doing this for a very long time and have worked with many hiring processes. A person without either experience or a recognized degree has a very difficult time getting their foot in the door. I may have misread and thought that the OP was interested in engineering.
For PMs or management, the skill set, and how you demonstrate it, will be different.
I've been running pgBouncer in large production systems for years (~10k connections to pgbouncer per DB, and 200-500 active connections to postgres). We have so many connections because microservices breed like rabbits in spring once developers make the first one, but I could rant about that in a different post.
We use transaction-level pooling. Practically, this means we occasionally see problems when some per-connection state "leaks" from one client to another: someone issues a SQL statement that changes session-level state, and a later client inherits that state and its queries misbehave. It's annoying to track down, but now that the behavior is understood, developers generally know how to limit their queries. Some queries aren't appropriate for going through pgbouncer, like cursor-based queries, so we just connect directly to the DB for the rare cases where that's needed.
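For reference, a transaction-pooling setup looks roughly like this (a pgbouncer.ini sketch; the host, names, and sizes here are illustrative, not our production values):

    [databases]
    appdb = host=10.0.0.5 port=5432 dbname=appdb

    [pgbouncer]
    listen_addr = 0.0.0.0
    listen_port = 6432
    auth_type = md5
    ; server connections are handed out per transaction, not per client
    pool_mode = transaction
    ; thousands of client connections in, a few hundred Postgres connections out
    max_client_conn = 10000
    default_pool_size = 400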
Why so many connections? Say you make a Go-based service which launches one goroutine per request, and your API handlers talk to the DB. The way sql.DB connection pooling works in Go is that the pool grows large enough to satisfy the working parallelism, and it doesn't yield connections for a while. Similar things happen in Java, Scala, etc., and with dozens of services replicated across multiple failure domains, you end up with a lot of connections.
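You can cap the pool explicitly if you want to rein this in; a minimal Go sketch (the DSN and the numbers are made up, not a recommendation):

    package main

    import (
        "database/sql"
        "log"
        "time"

        _ "github.com/lib/pq" // any Postgres driver works here
    )

    func main() {
        // Point the service at pgbouncer (port 6432), not at Postgres directly.
        db, err := sql.Open("postgres", "host=pgbouncer port=6432 dbname=appdb sslmode=disable")
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()

        db.SetMaxOpenConns(20)                 // hard cap; by default the pool grows with request parallelism
        db.SetMaxIdleConns(10)                 // don't keep every burst connection around
        db.SetConnMaxIdleTime(1 * time.Minute) // release idle connections instead of holding them for a while
    }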
It's a great tool. It allows you to provision smaller databases and save cost, at the cost of some complexity.
> microservices breed like rabbits in spring once developers make the first one
Microservices talking to the same DB... that's not microservices, that's a disaster. You basically combine the negatives of the microservice world with the negatives of the monolith: tight coupling.
Databases are there to share data and provide transactional guarantees and even locking. Your data often must be tightly coupled like this; most databases are designed with that in mind and reward you for it. It doesn't mean your apps need to be tightly coupled, and there are still plenty of benefits in deployment and operations to be had with microservices. Silo the data when it makes sense, but if you force the issue you end up with a different problem: reimplementing the benefits of a database in the app layer, or with a fault-tolerant, guaranteed-delivery messaging system (itself a database under the hood).
> If you make over $66k, your income tax is 9.3%, going up to 13.3% if you make FANG money.
The marginal rate is 9.3% above 66k. If your AGI is $67k your taxes are ~$3000. There is no 13.3% bracket. There is a 12.3% bracket for marginal income above $677k, which fewer than 1% of Californians earn.
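To make the arithmetic concrete (bracket boundaries approximate): on a $67k AGI, only the last ~$1,000 is taxed at 9.3%, about $93; the rest falls in the 1% to 8% brackets below it, which is how you land at roughly $3,000 total, an effective state rate of about 4.5%.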
You pay surcharges not included in the base tax rate as income goes up. There's a 1% surcharge on income over $1M that came from a ballot proposition. I've never hit it, and doubt I ever will, but it's there.
In the US, you pay state tax and federal tax separately, so those are only the state taxes. If you're paying the maximum marginal rate in CA, you're also paying 37% federal rate and a 3.8% surcharge on investment income on top of that.
You also have to remember that in CA the assessed value of your home (on which your property tax is based) will never go up more than 2% per year. So while the property tax rate may be higher than some other places, if you stay in your home for more than a few years, the actual amount of property tax you pay will quickly be lower than it would be elsewhere.
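To put numbers on it: with the 2% cap, a house assessed at $500k today can be assessed at no more than about $500k × 1.02^10 ≈ $610k ten years from now, even if its market value has doubled in the meantime.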
Why is 350ppm the perfect concentration? Higher is better for plants, for example, because most are carbon-limited. The Carboniferous era, whose biosphere sequestered so much carbon, had far higher concentrations.
I’m not disagreeing, but that seemed like a statement out of the blue.
Some people have an aversion to "goto" statements, and Java annotations are even worse, they're a "comefrom" statement, where you end up executing code before or after your function based on this annotation, so it makes the code really annoying to follow.
Java examples which show trivial code annotated with @POST or @Path are not representative of production systems, where you may have a lot more annotations for your DOM, documentation, and in some cases, you actually have more annotation boilerplate than you have code in the handler/controller.
Having annotations interleaved within your logic makes it difficult to provide good API documentation, and it's hard to refactor automatically, because your boilerplate is interleaved with real code. With an approach like grumpyrest's, you can put all your machine-generated code into a package and connect it to your hand-written code with a little bit of glue. It makes spec-driven development much easier.
OpenAPI is very popular, and annotation based frameworks make it more difficult to integrate with it. If you generate API docs automatically from code, as with JAX-RS, it's easy to break things by accident because nobody audits machine generated docs. If you reverse the approach, and do spec-driven development, you code review the API behavior, and the code follows, which is a better model, in my opinion. Grumpyrest looks like it makes integration with spec-driven workflows quite easy.
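Not grumpyrest (that's Java), but for contrast, here's a minimal Go sketch of the explicit-wiring style being described: the routing table is ordinary code you can read top to bottom, with nothing executing "before" your handler via annotations. The path, request type, and handler are hypothetical.

    package main

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    type createOrderRequest struct {
        Item string `json:"item"`
        Qty  int    `json:"qty"`
    }

    func createOrder(w http.ResponseWriter, r *http.Request) {
        if r.Method != http.MethodPost {
            http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
            return
        }
        var req createOrderRequest
        if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
            http.Error(w, "bad request", http.StatusBadRequest)
            return
        }
        w.Header().Set("Content-Type", "application/json")
        json.NewEncoder(w).Encode(map[string]int{"qty": req.Qty})
    }

    func main() {
        mux := http.NewServeMux()
        // The whole routing table lives here, in one reviewable place,
        // instead of being scattered across @Path/@POST annotations.
        mux.HandleFunc("/orders", createOrder)
        log.Fatal(http.ListenAndServe(":8080", mux))
    }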
A word of caution to the author; if this takes off, you will be inundated with issues and PR's, since people will use this in ways you never dreamed of. I'm experiencing that kind of onslaught in something I open-sourced for Go for REST API's.
> Some people have an aversion to "goto" statements, and Java annotations are even worse, they're a "comefrom" statement, where you end up executing code before or after your function based on this annotation, so it makes the code really annoying to follow.
I like this comment so much because I would have described annotations as a "comefrom" myself... but then I probably read that somewhere and forgot about it.
> Grumpyrest looks like it makes integration with spec-driven workflows quite easy.
This is interesting to me because I've always thought about it the code-first, generate-documentation way, and I've always wondered if/how I can derive all the metadata from the code, which, admittedly, annotations make much easier because they are statically accessible.
Doing it the spec-first way is something I should definitely consider.
> A word of caution to the author; if this takes off, you will be inundated with issues and PR's, since people will use this in ways you never dreamed of. I'm experiencing that kind of onslaught in something I open-sourced for Go for REST API's.
Thank you for the warning. Do you have any advice on how to prepare for that?
> Some people have an aversion to "goto" statements
That's because one of the first things they teach at school is that "goto" is bad, but no one really gets why (at the time). It just goes into their brains and carries forward.
> Java examples which show trivial code annotated with @POST or @Path are not representative of production systems
That happens with every language though. There are disasters everywhere.
Regarding API docs, my experience has steered me toward generate-docs-from-code. The problem with writing API docs first and generating code after is that writing API docs is a 'business' collaboration, which ultimately isn't done by developers. The code is a 'technical' model of that API contract, and the developers are limited by the API-to-code generator and, even more importantly, by the limitations of the coding language.
As a result, the 'business' comes up with the API contract, which is discussed and approved by multiple departments and committees, and then the 'technical' developers have to show that it's impossible to implement with current technology.
I'm biased, because I did work with IRIX at SGI and I did in fact touch some kernel code as well.
What I miss from IRIX, that no other system has yet replicated:
1) Realtime mode. RTLinux doesn't count. In IRIX you could run your own code at a higher priority than the scheduler itself (this was called hard realtime), giving it guaranteed time slices, or even a dedicated core or two.
2) Frame scheduling. When rendering 3D, you could tie the scheduler to the monitor refresh rate, making you less likely to miss a frame boundary.
Yes, Motif was very plain, and X11 was a hot mess to work with, but the thing had capabilities which are still hard to find.
> What I miss from IRIX, that no other system has yet replicated: 1) Realtime mode. RTLinux doesn't count.
FWIW, SGI didn't seem to agree.
From SGI's own whitepaper: "In addition, REACT for Linux adds unique capabilities including sgi-shield and kbar that were not available on IRIX. The Linux based platform delivers better real-time performance than SGI Origin running IRIX with realtime extensions: 30µs guaranteed interrupt response time versus 50µs for Origin."
Without REACT extensions, Irix realtime facilities aren't any different than the scheduling policies of Linux (this is akin to bypassing the normal scheduler).
I have a soft spot for Irix from the early 90s, and it had some clever accommodations for the technology at the time, but things have moved on and advanced.
I have some hard numbers on this in my own health. I'm a pasty north European living in California, and I get plenty of sunshine, as I enjoy the outdoors. As I got older, I got increasingly more viral infections, and strep throat was a twice-a-year affair. It didn't occur to me that I could be vitamin D deficient, but I was, severely. For some reason, 10+ hours of sunshine a week isn't enough for me to make my own.
Over months, I tracked my blood vitamin D levels as I took supplements, and now I know that a 2500 IU dose is enough to keep me at nominal levels; the higher doses caused it to build up too much and would eventually have caused liver issues.
I know it's anecdotal, but in the few years since I did this, those recurrent sicknesses have vanished. I'm also taking vitamin B, which is another one you get less efficient at absorbing as you age.
Just because you get plenty of sunlight where you live it doesn’t mean you get enough exposure to produce vitamin D.
You need to expose large areas of your body like legs, arms, and torso for several minutes to an hour. If those areas are covered, sunlight won’t do much.
Nah, for me, that wasn't sufficient. I get at least 10 hours a week of full sun on my whole arms and legs and sometimes my torso if I decide to bike shirtless; more sun than most people, and still, I was vitamin D deficient.
Did you wear sunscreen? I'm pretty sure it blocks vitamin D creation, which really makes me wonder how any of the GFs I've had ever got any, since they wear it religiously in hopes of keeping youthful skin.
Did you have a challenge getting them to initiate that? I'd imagine once you're shown to be severely deficient it's easier, but there's a chicken & egg.
If your primary care provider won't do at least one vitamin-d test per year, and then periodic tests once a deficiency is identified, you should get a new primary care provider.
If that's not an option, non-prescription at-home tests are available in the US for as low as $49.
I don't know about monthly; I think a 60-day follow-up after treatment in order to titrate your supplementation might be best (it takes time for supplementation to work), but at-home tests are an option if your provider is, for some inexplicable reason, reluctant.
As far as I can tell, the only harm from supplementation comes when the typical adult consistently consumes doses so high as to be absurd (50,000+ IU daily for months) so I don't know why any medical professional would be hesitant to investigate such a common health issue with such an easy, inexpensive, and effective treatment.
The way we tune software has changed. In the past, these systems were indecipherable and had no good software profiling or debugging tools, so the dev setups came with a lot of hardware attached that let you profile, tune, and debug.
If you wanted to profile PS2 code, for example, the way you did that was with a gigantic logic analyzer connected to various traces at Sony's labs. We used their Redwood City lab. You'd give Sony instrumented code, they'd do some kind of secret magic on it and run it on this instrumented PS2, and a few days later you'd get a profile. The downside was that this rig was incredibly complex; the upside was that profiling carried very little performance penalty. Later, these kinds of tools came built into dev kits.
For the PS3, programming the Cell was quite difficult, so you had to do lots of tuning. The single PPC core (called the PPU) was really slow, so you had to offload what you could to the SPEs, which were incredibly fast at floating point math, but it was on you to interleave DMA and computation in a way that got good utilization out of them. The dev kits had tools to give you visibility into this.
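The general pattern (fetch the next chunk of data while the current one is being computed on) is easy to show outside of Cell; here's a generic Go sketch, not SPE code, with made-up stand-ins for the DMA transfer and the compute kernel:

    package main

    import "fmt"

    // fetchChunk stands in for the asynchronous DMA transfer bringing data in.
    func fetchChunk(i int) []float32 {
        buf := make([]float32, 1024)
        for j := range buf {
            buf[j] = float32(i)
        }
        return buf
    }

    // process stands in for the floating-point kernel running over the data.
    func process(buf []float32) float32 {
        var sum float32
        for _, v := range buf {
            sum += v
        }
        return sum
    }

    func main() {
        const nChunks = 8
        next := make(chan []float32, 1) // one chunk "in flight" at a time

        // Producer: fetch chunk i+1 while the consumer is still computing on chunk i.
        go func() {
            for i := 0; i < nChunks; i++ {
                next <- fetchChunk(i)
            }
            close(next)
        }()

        var total float32
        for buf := range next {
            total += process(buf)
        }
        fmt.Println("processed", nChunks, "chunks, total:", total)
    }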
Programming on a laptop only requires the laptop, since you've got all the tools you need in the OS. Only Apple does debugging at the board level.
I'm a Brown grad, who graduated in the mid 90's. In the mid 90's, Brown's tuition was an unthinkable $32k/year, if I remember correctly. As an immigrant with no family assets, I got a big financial aid package, and being a Rhode Islander, I also got a state grant, and took the rest as loans. It was just barely doable for someone who didn't have family assets. Today, it seems truly impossible.
Since I'm from Providence, I stop by Brown regularly when I visit my family, and I have lots of memories from that time. Today's campus looks very little like it did in the 90's. We had dumpy dorms, crappy cafeterias, and rather plain but functional buildings. Today, the dorms have been renovated or rebuilt to be much more luxurious, the school has built a number of incredibly opulent buildings, and there's just a whole lot more space and more wealth on display. Total enrollment went from somewhere around 8,000 then to about 10,000 today, so it's not that they're building these things to handle more students.
It seems that per-student spending has gone way up, and they're now providing a luxury education experience, not just a degree.
A HS friend of mine went to Penn and for a brief period was an admissions counsellor there. She said many times that if you were smart enough to get in, the administration would do everything in its power to get you whatever grants, scholarships, and loans you needed. She was there probably 7 or 8 years and could recall fewer than one case a year where someone was admitted but did not attend due to cost.
Mediocre colleges will leave you on your own 99% of the time, but if you're good enough to get into an Ivy or a similarly ranked school like Stanford or MIT, it's very unlikely that cost will be the determining factor (not that it will be easy, of course).
It's an Ivy League, there is an implied class -- and I ain't talkin about school.
There are lifestyle expectations, either because the well-off folks expect it, or because those trying to gain a foothold believe it's something they have earned or will have earned.
Lab grown fish are one thing, but simple fish farms already work really well, the issues are that the fish don't eat a natural diet and are sicker and less healthy to eat. Maybe that's an easier problem to solve than growing fish meat in a lab.
It still means deforestation, biodiversity loss, soil degradation, and poisoning the ecosystem with pesticides/herbicides and fertilizers on land (with the associated runoff into the waterways), plus a lot of organic waste in the water: each salmon farm in Scotland produces organic waste equivalent to that of a town of between 10,000 and 20,000 people each year.