Hacker News
[dupe] 90-year-old man spends $10K on ads to tell AT&T CEO about slow internet service (news10.com)
65 points by turtlegrids on Feb 13, 2021 | hide | past | favorite | 66 comments




All I see is what amazing ROI that old man got for his $10k. I'm sure he's thrilled about reaching the top of HN multiple times in a week!


I was... $10k?? That's not what I remember.

So I looked back at the original article.

They originally said $1100, but then after I read it they revised the number to $10k.

He was more frustrated than I originally thought!


And a phone call from the CEO of AT&T.


What's a phone call compared to HN?


AT&T's been so slow in getting fiber laid in the north and west suburbs of Chicago that I know more than a dozen homes in line for Starlink. My family's been puttering along on 25 down/5 up for the last 4 years, with an annual angry phone call to keep it under $80. AT&T's done a half dozen neighborhoods that Comcast, RCA or Verizon haven't, but none in ours.


When AT&T thought Google Fiber was coming to Raleigh, NC, they magically managed to lay fiber to my subdivision on the outskirts of town. $70/mo for unlimited symmetric 1Gbps service. When they're motivated, they can apparently move quite quickly. That was over two years ago.

Google Fiber still isn't available anywhere near me.


AT&T laid fiber in my neighborhood and stopped 10 houses away in each direction, leaving 20-30 houses in my neighborhood with crappy DSL. The reps I talked to said I could pay for it, but that I probably wouldn't want to know the cost.

Thankfully Comcast has Gigabit service, but the upstream is capped at 25 Mbps. I enjoyed the symmetrical gigabit at my last place.

For $2000 and $300 a month Comcast will give you 2.5 gigabit connection with SFP+ and a 1 gig failover.


> For $2000 and $300 a month Comcast will give you 2.5 gigabit connection with SFP+ and a 1 gig failover.

Time to make some friends with your neighbours.


Starlink isn't really a solution for dense areas like suburbs at the moment; maybe in future versions. Well, except for the few who can get it early.


Interesting, I didn't realize there was a limitation based on the density of transceivers.

Do you have a link with more info?


I don't have a link to reference, but without getting into the specific details I was given, it's not really any different from any other wireless; the problem is just larger. E.g. a good AP can handle 20 clients spread across 360 degrees at different distances with pretty decent bandwidth. Put all 20 clients within 1 degree, all the same distance away, and it becomes a problem to serve that spot.
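The spread-vs-clustered intuition above boils down to clients in one coverage spot splitting that spot's capacity. A minimal sketch, with purely illustrative numbers (not real AP or Starlink figures):

```python
# Rough sketch: clients in one beam/cell share the same spectrum, so
# per-client throughput falls linearly with how many of them land in
# the same coverage spot. All numbers are illustrative assumptions.

def per_client_mbps(cell_capacity_mbps: float, clients_in_cell: int) -> float:
    """Evenly split one cell's capacity among its active clients."""
    return cell_capacity_mbps / clients_in_cell

cell_capacity = 1000.0  # Mbps available in one beam/sector (assumed)

# 20 clients each in their own beam/sector: each gets the full cell.
spread = per_client_mbps(cell_capacity, 1)

# The same 20 clients packed into one spot: they split a single cell.
packed = per_client_mbps(cell_capacity, 20)

print(spread, packed)  # 1000.0 50.0
```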


So add neighborhood wireless networks for last mile.

Lay pipe later once the city is forced to cough up a license to dig.


Whether you serve 20 clients in one spot or 1 client 20x the size in one spot, the problem is still allocating the same amount of spectrum to that spot. You could get away with a single much higher-end receiver, which would help if you designed the satellites for it, but there's still a lot of low-hanging fruit Starlink needs to grab before that makes more sense than finishing the additional frequencies and shells in the first place.


Each satellite has a bandwidth limit and can cover a certain area (more satellites per area close to the poles). It's not that they "can't"; it's that they won't be able to be competitive in most suburbs, at least not at this time.


Basically, fixed wireless should win every time over satellite when fixed wireless is possible.

Less path loss means more bits per Hz of bandwidth. Less exotic equipment (no beam-steering). More opportunity for densification to increase capacity. The possibility of wired backhaul.

It's only when population density is low, for fill-in where towers don't quite reach, or for specialized uses (military, redundancy in access, some mobile use cases, some financial use cases needing reduced latency) that satellite is a clear win.

Starlink can have a plenty big business addressing the most rural areas and fill-in, though.
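The path-loss-to-capacity argument above can be sketched with the Friis free-space formula and Shannon capacity. This is back-of-the-envelope only: the Ku-band carrier, the 5 km fixed-wireless hop vs 550 km LEO slant range, and the "identical power and antennas on both links" assumption are all illustrative, not anyone's actual link budget:

```python
import math

# Free-space path loss (Friis) and Shannon spectral efficiency, to show
# why a shorter link buys more bits per Hz of spectrum. Numbers are
# illustrative assumptions, not Starlink's or any WISP's real parameters.

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def shannon_bits_per_hz(snr_db: float) -> float:
    """Shannon spectral efficiency log2(1 + SNR), SNR given in dB."""
    return math.log2(1 + 10 ** (snr_db / 10))

f = 12e9                       # Ku-band-ish carrier (assumed)
tower = fspl_db(5_000, f)      # ~5 km fixed-wireless hop
sat = fspl_db(550_000, f)      # ~550 km LEO slant range

extra_loss = sat - tower       # extra dB the satellite link gives up
# If the fixed-wireless link lands at 30 dB SNR, the satellite link
# (same power and antennas, all else equal) lands ~41 dB lower.
print(f"extra path loss: {extra_loss:.1f} dB")             # 40.8 dB
print(f"bits/Hz at 30 dB: {shannon_bits_per_hz(30):.2f}")  # 9.97
print(f"bits/Hz at {30 - extra_loss:.0f} dB: {shannon_bits_per_hz(30 - extra_loss):.3f}")
```

In practice satellites claw much of this back with high-gain phased arrays, which is part of why the terminals are expensive; the point is only that distance costs dB, and dB cost bits per Hz.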


Fixed wireless is plagued by line of sight. The other problems haven't really been an issue for Starlink, even in the early phases; it's really just the density issue that bites it.


Line of sight is a bit of a pain. So is needing sky in all directions to 10-20 degrees of elevation.

Satellite is great for A) where it's not worth putting up towers to reach just a few residents, and B) filling in where there's little holes in tower coverage. (This is what I tried to say in the comment you're replying to).

> The other problems haven't really been an issue for Starlink, even for being in the early phases, it's really just the density issue that bites it.

Capital costs are high per simultaneous user served in an area. Densification is hard. Spectral efficiency is the key figure of merit (how many bits you get to users per limited bit of radio spectrum). That is, the things I mention *are* the reasons for the "density issue".


Starlink currently cuts off minimum elevation at 40 degrees; with lower shells they'd eventually like to drop that to 25 degrees. Not sure where you're pulling these 10-20 degree numbers from, but allowing it to go that low would actually create more problems than it would solve.

> That is, the things I mention are the reasons for the "density issue".

The things you mentioned were path loss, no need for beam steering, the possibility of wired backhaul, and density. I'm just saying the only one of those that actually matters is density; the others are really non-problems, as the equipment already exists, is reasonably priced, and works fine.


> Not sure where you're pulling these 10-20 degree numbers from

Starlink uptime where I am (SF Bay Area) with a 35 degree elevation cutoff is 80%. With a 25 degree elevation cutoff it's 98-99%. To get 99.5% you need to get down to 15 degrees presently.

Yes, this will improve, but for an area to share its bandwidth over multiple satellites, there will need to be multiple satellites in the cone that area can talk to.

> The things you mentioned were path loss,

Less path loss for a given power == more bits/Hz == more people served from one transceiver with a certain amount of bandwidth. AKA density

> no need for beam steering

OK. That's a terminal cost issue, and isn't density. Though it's closely related to allowing the towers to have narrower coverage (e.g. sector antennae) and thus improves density.

> possibility of wired backhaul

Ground station uplink bandwidth is often limiting for Starlink. Allowing more bandwidth to the transceiver == density.

> density

Allowing densification by putting up more towers where needed == density.

> , is reasonably priced,

Right now, Starlink terminal BOM costs are likely around $2k. This can be expected to improve in the future, but electrically steered phased arrays with tons of phases are still a somewhat exotic technology. Cf. high-end fixed wireless CPE selling for $200 at qty 1 retail (which can also be expected to improve in cost and bandwidth in the future).


You're presenting an awful lot of speculation as fact here, and it's making it difficult to talk about what actually is. For instance, the elevation cutoff isn't arbitrary, i.e. it's not "whatever number would make it work reliably at my house". It doesn't matter if a satellite is available at 15 degrees; you will not have service until one is available at a higher angle.

I assume you're looking at something like (or following similar logic to) https://sebsebmc.github.io/starlink-coverage/index.html and, well, there's a reason that page has 2 options in the dropdown: the current limit of functionality and the proposed future limit of functionality. Also, I think it goes without saying that if it's not fully launched where you live, then where you live is a poor choice for trying to reverse engineer numbers in the first place.

I will give a list of facts about Starlink as it currently stands: not my personal speculation or speculation from others, but rather knowledge from currently working with operational Starlink and many fixed wireless providers in a large enterprise network:

- Path loss is not a driving factor in the density problem (more details on why below, but it boils down to this: the bandwidth losses here are not what's causing the density squeeze). The path loss is also very different from the path loss to a terrestrial tower, as there is a lot more at play than distance (atmospheric density, atmospheric composition, frequency, frequency interaction with molecules in the atmosphere, power limits for interference being different for ground<->horizontal vs ground<->up), so you can't just think "more distance, therefore path loss is a bigger problem than it is for fixed wireless" either.

- The cost for the service is $499 upfront + $99/month for a real-world expected rate of 100 Mbps down / 20 Mbps up @ 1 TB/month soft cap w/o throttling. This pricing beats >95% of fixed wireless offerings, Verizon's 5G demo cities probably being the exception. Pricing is not a problem even if there is fancy tech involved in the user terminal.

- Ground station bandwidth hasn't been a problem for Starlink, let alone a frequent problem as you confidently claim. Ground stations also continue to be turned up at the same rate as new satellites go active, so no need to speculate there either.

- It's quite easy to add both more towers and more satellites, but neither is actually a realistic solution to their respective spectral density problems: for towers, density reaches the point where the cost savings of using towers for the last mile go out the window; for satellites, the spectral density problem is on the ground. Remember, it's not going to be a fully shared collision domain like Wi-Fi, where you just spread the same frequency to everyone attached in a sector.

- Both can be expected to improve drastically in the future. Starlink is barely deployed enough to function at all in the best geographical locations, many of the assigned frequencies aren't in use, satellite<->satellite links still haven't been deployed, the lower shell hasn't even been started, and only a handful of ground stations are up. Similarly, 5G (or the latest 4G) is still early in deployment in the US; good & cheap 5G hardware is just now becoming available at scale, and network deployments are really kicking into gear.


Spectrum is different with satellites, as they use directional antennas. In theory you can have nearly arbitrary satellite density, depending on efficiency. The only effective limit is the user density you're aiming to support.


Fixed wireless uses directional antennas, too. Towers can have sector antennae, and you can have multiple towers servicing an area, too. (Indeed, small LEO satellites' ability to spot beam is limited, but sectors on fixed wireless can provide a -lot- of densification).


Directional antennas on the ground are at much higher risk from obstructions. Multipath propagation is essential for large wireless networks without building giant towers everywhere.


> Directional antennas on the ground are a much higher risk from obstructions.

Giant towers / hillsides for the central point; small towers or rooftops for the other end.

Between two hills over my suburb, you can see 80%+ of houses. If your point is that this doesn't work well for dense cities: you're right, but it breaks down at a higher population density than satellite does.

Relying upon multipath and diffraction can work to get moderate quality LTE service everywhere, but in general this is not what WISPs are doing, because it doesn't work. (And in any case, MIMO + multipath are friends, too, which provides its own densification, and sector antennae are effective, too).


This is why WISPs are so location specific. With the right geography, population density, etc it can work but we don’t have these companies going nationwide.


We don't have these companies going nationwide because it's one of those things that's inherently a small to medium business. It doesn't work places with high population density. It doesn't work places with very low population density. It requires compliance with local codes and a relatively high amount of real estate work per subscriber. They don't benefit from legal frameworks intended to make things better for cellular carriers.

Satellite is great for very low to low population density, and fill-in coverage in medium population densities.

Fixed wireless is good for low to medium population density in most geographies, and fill-in coverage in high population densities.

Wired internet access is good for places with medium to high population density.

Cellular can provide fill-in coverage for medium to high population densities.


Satellite population density is a little different as individual Starlink satellites cover a ~580 mile radius circle. Illinois is only 232 people per square mile and near the Great Lakes so as far as the satellites are concerned, and excluding the core urban areas with high speed internet, the area has a fairly low population density.

The Northeast megalopolis is a larger issue, but again it's next to the ocean, which makes a huge difference. Further, including launch costs it's ~$2 million a satellite. Depending on lifespan, break-even could be below 1,000 customers per satellite, which could support a very dense network, especially if they charge more for aircraft and boat internet access.

PS: Current Satellites are 20GBPs, so if they average ~1000 people per satellite that’s 20 MBps bandwidth per customer. Traditional home internet can be 10x oversubscribed, but assuming they lose a lot of that to low population areas 1-2x ~= 20-40MBPs in high density areas during peak usage periods.


My town and the town next to it have 25,000 households. If you want to provide 50 Mbit/s to 20% of households, while oversubscribing by 50x, ... you've basically used up 20-40% of a simultaneous-overhead satellite.
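The sizing arithmetic above can be sketched as demand vs one satellite's capacity. The household count, take rate, and the 10-20 Gbps per-satellite capacity range are the thread's own rough figures, not published specs:

```python
# Sketch: subscribers' peak demand after oversubscription, as a
# fraction of one satellite's capacity. All inputs are the thread's
# back-of-envelope figures, not published Starlink numbers.

def satellite_fraction(households: int, take_rate: float,
                       rate_mbps: float, oversub: float,
                       sat_capacity_gbps: float) -> float:
    """Fraction of one satellite consumed by a town's subscriber base."""
    subscribers = households * take_rate
    demand_gbps = subscribers * rate_mbps / oversub / 1000
    return demand_gbps / sat_capacity_gbps

# 25,000 households, 20% take rate, 50 Mbit/s plans, 50x oversubscription:
for cap in (10, 20):
    frac = satellite_fraction(25_000, 0.2, 50, 50, cap)
    print(f"{frac:.0%} of a {cap} Gbps satellite")  # 50% / 25%
```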

At any time, you can expect there will be dozens of towns of my size in the footprint.

(And, well, most of the time, the entire SF Bay Area, and the Monterey metropolitan area).

Starlink can sell tons of subscriptions, but they can't address suburban connectivity issues.

An alternative analysis, looking at the FCC RDOF areas where funds were awarded to Starlink (completely unserved by broadband, i.e. very low density areas), says that Starlink can address around 50% of these households (not counting any capacity sold to nearby users in cities or towns) and provide 15 Mbps peak hour usage, which is expected to be typical peak hour usage in a few years. https://ecfsapi.fcc.gov/file/10208168836021/FBA_LEO_RDOF_Ass...

P.S. I see your edit and that now you're making an economic argument. I totally think Starlink can be profitable. I don't think Starlink can address anything but the least dense areas and occasional fill-in in other areas, but these are still very large potential markets. As to supporting a very dense network... there are reasonable limits on how many satellites we can expect to have overhead.

P.P.S. gbps not GBps. Currently thought to be 10gbps, but also it's thought they will not have much issue reaching to 20gbps.


Earth only has 196,900,000 square miles. A 580 square mile circle is 580^2 * 3.14, so a little over 1 million square miles. At a minimum, that 12,000 satellite network means ~60 are within range at the same time. At 42,000 you're at ~210 overhead at the same time. While nobody can see every part of the sky, across tens of thousands of people plenty will get at least a slice near the horizon. So you can include satellites out over the ocean and exclude people with access to high speed internet already.

Overall, a 42,000 satellite network could connect something like 1/2 the "1.3 million people in California without access to a wired connection capable of 25 Mbps download speeds." As I doubt 1/2 those households are going to want to pay $100/month for internet, the 42,000 satellite network should cover California just fine.

PS: Of note, their ground receiver can only point at a slice of the sky, but I suspect plenty of people near the ocean will focus on that slice for better bandwidth.
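The satellites-in-view estimate above, as a sketch. It assumes a uniform distribution over the Earth's surface, which (as the reply notes) inclined orbits don't actually give, and it uses the thread's ~580 mile coverage radius rather than any official figure:

```python
import math

# Sketch: fraction of Earth's surface one satellite's coverage circle
# spans, times the fleet size, assuming a uniform distribution.
# The 580 mile radius is the parent comment's figure, not official.

EARTH_AREA_SQ_MI = 196_900_000

def satellites_in_view(fleet_size: int, coverage_radius_mi: float) -> float:
    """Expected satellites overhead at one spot, uniform-distribution model."""
    coverage_area = math.pi * coverage_radius_mi ** 2
    return fleet_size * coverage_area / EARTH_AREA_SQ_MI

print(round(satellites_in_view(12_000, 580)))  # ~64
print(round(satellites_in_view(42_000, 580)))  # ~225
```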


> At a minimum that 12,000 satellite network means ~60 are within range at the same time.

It doesn't work like that, unfortunately. It's not uniform and satellites in highly inclined orbits spend more time around the poles.

A 580 mile radius is also far too much. This may -barely- work for the outer shells, but the link budget and rates get worse there, and of course these outer shell satellites need to be shared over an area enclosing more population.

Of course, link budget, data rates, and spectral efficiency also get worse as the satellites are lower in the sky, too.

Of course, receiving from an arbitrary subset of 60 transmitters occupying the same frequency band with an electrically steered phase array gets pretty hard when there's a big difference in the received power levels between the transmitters, too.


Sure, it’s also worth noting that satellites over the ocean may not have a direct connection to a ground station, or may eventually be relaying long distance connections.

That said, a ground station on the coast connecting to a satellite over the ocean has a huge impact on density calculations as so many people live along coastlines. Especially so when looking at Hawaii and other islands.

However, this is all rough order of magnitude calculation. Starlink’s constellation is optimized for its potential customers rather than simply aiming for uniform global coverage. https://i1.wp.com/starnationsnews.com/wp-content/uploads/202...


Seriously, go read the analysis I linked. I was pleased to see it aligned well with my back of the envelope numbers, and it seems to say that Starlink has a bit of a potential challenge in reaching the underserved populations that it's receiving FCC subsidy to address... They've got a fairly decent model of number of satellites overhead vs. number of subscribers in that area (and assume no one outside of the subsidy customers will buy, which is pessimistic).


I did, but I will more directly address where they are being overly pessimistic. For example, they are likely to prioritize US coverage, which could reach desired goals before completing the global network.

> Allocate 12,000 satellites equally spaced across 72 planes to approximate the Starlink fleet

That’s a reasonable criticism for 2028, but not their actual goal.

> Assuming a 70% broadband uptake rate of assigned locations

Clearly not the goal at 12k total satellites, otherwise they would not be aiming for a larger network.

> 500-km coverage radius

Doesn’t seem to be accurate based on other sources, but I would accept an actual source such as a current user.

> For RDOF locations, we have uplifted these estimates of peak usage to establish a minimum capacity required of 3.6 Mbps per subscriber

That’s not how users behave: normally people use zero, ~maximum, or stream a specific amount of bandwidth, which responds to the maximum available bandwidth.

Anyway, they control the rate and geographic location of new users and can therefore maintain minimum bandwidth standards by slowing adoption based on both their current network and user behavior.


> For example, they are likely to prioritizing US coverage

This is difficult, because the earth spins under the orbital planes.

> Assuming a 70% broadband uptake rate of assigned locations Clearly not the goal at 12k total satellites otherwise they would not be aiming for a larger network.

These are the locations that Starlink bid on and received FCC subsidy for, to provide connectivity to locations completely unserved by broadband.

> 500-km coverage radius Doesn’t seem to be accurate based on other sources, but I would accept an actual source such as a current user.

It's somewhat pessimistic, going all the way down to the elevation limit. tangent(55 degrees) * 550 kilometers = 785.481404 kilometers

At the same time, it's not likely you'll often want to be talking to a satellite at the elevation limit, as it'll be 6-7 dB+ further down even before taking into account that the phased array will offer less gain.
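The coverage-radius geometry, as a sketch: the flat-earth tangent approximation used above, next to a spherical-earth version. The 550 km altitude and the 35 degree minimum elevation implied by tan(55 degrees) are taken from the thread, not official parameters:

```python
import math

# Coverage radius of a satellite visible down to a minimum elevation
# angle: the flat-earth approximation from the parent comment, plus
# the spherical-earth version. Inputs are the thread's figures.

EARTH_RADIUS_KM = 6371.0

def flat_radius_km(altitude_km: float, min_elev_deg: float) -> float:
    """Ground coverage radius, ignoring Earth's curvature."""
    return altitude_km * math.tan(math.radians(90 - min_elev_deg))

def spherical_radius_km(altitude_km: float, min_elev_deg: float) -> float:
    """Ground coverage radius on a spherical Earth."""
    e = math.radians(min_elev_deg)
    # Earth-central angle to the point where the satellite sits at
    # elevation e: lambda = acos(Re/(Re+h) * cos(e)) - e
    lam = math.acos(EARTH_RADIUS_KM / (EARTH_RADIUS_KM + altitude_km)
                    * math.cos(e)) - e
    return EARTH_RADIUS_KM * lam

print(f"flat:      {flat_radius_km(550, 35):.0f} km")       # ~785 km
print(f"spherical: {spherical_radius_km(550, 35):.0f} km")  # roughly 670-675 km
```

Either way, both numbers sit above the report's 500 km figure, consistent with calling it somewhat pessimistic.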

> For RDOF locations, we have uplifted these estimates of peak usage to establish a minimum capacity required of 3.6 Mbps per subscriber That’s not how users behave, normally people use zero, ~maximum, or stream a specific amount of bandwidth. Which responds to the maximum available bandwidth.

Forecast average peak hour demand was what was used to size these numbers, which is reasonable. Assuming stochastic demand with the same average makes this worse rather than better.

> Anyway, they control the rate and geographic location of new users and can therefore maintain minimum bandwidth standards by slowing adoption based both their current network and user behavior.

Starlink has committed to provide service to these users as a term for receiving these FCC subsidies.


It should be clear from context, but:

> A 580 square mile

should read: a 580 mile radius.


I don't know how many people can share a satellite.

They're 340 miles up and traveling ~17,000 mph (I think?).

I wonder how many satellites will be in view, and for how long.


1,400 satellites is enough for global coverage, but their aim is 42,000, which suggests ~40+ are going to be in range at one time.


I think this is an important puzzle piece in "distributed living". Pretty soon we'll be able to live "off-grid" with good connectivity, plus the other parts like power and transportation, without it being super-expensive or a super-low quality of life.


> “We’re going to see what we can do for you,” he says they told him. He figured if they put in fiber optics in his area, it would improve speed for his neighbors too.

Honestly, I’d consider paying $10k for fiber internet run to my (relatively) rural house.


I had the same experience with AT&T fiber Facebook newsfeed ads. I keep getting them while browsing at home, and when I click through it simply tells me it's not available at my address. I ended up signing up for non-fiber AT&T through that and I still get ads on FB for AT&T Fiber.

You'd think with all the invasive tracking, FB would be able to optimize conversion rates and ad spend for their customers at least a little bit, but I guess they don't really care about optimized spending.


Advertising for a practical monopoly like AT&T is like those advertisements to fly Aeroflot back in the USSR, where it was the only airline. The AT&T advertising budget just needs to be spent, and FB is happy to help.


I wouldn’t characterize AT&T as a near monopoly (at least among ISPs). There are multiple DSL providers, plus Comcast for cable. This is just pure marketing waste.


All I have is Spectrum in my neighborhood, despite being in the middle of a huge metro area. Everyone I work with has AT&T and I am stuck with Spectrum's overpriced slow lane, as there are no alternatives. Maybe Starlink will finally provide some competition some day.


“Our European visitors are important to us.

This site is currently unavailable to visitors from the European Economic Area while we work to ensure your data is protected in accordance with applicable EU laws.”

The GDPR went into effect almost three years ago. I guess we’re not that important to news10.com.



And who is AT&T?


A dumb pipe company.


Clearly we're not important, otherwise you guys would follow the rules. Actually, blocking access outright is against the law.


Expecting a random local news station in the US to subject itself to European jurisdiction and follow European privacy laws is a bit of a stretch, IMO.


Shutting down access completely instead of holding back a few cookies is a bit of a stretch as well.

If you must fail, please fail open.

And if you decide to fail closed, don’t lie: we’re unimportant to you, so you decided to fail closed, and you’re not “working” on anything. You think it’s OK and it can be left like this.


Ha! You attempting to access a computer service that you aren’t supposed to have access to is against the law.


Anyone else amazed that those ads only cost 10k?


I can't remember the last time I saw a print copy of the WSJ or NYT that wasn't in a hotel lobby, so it seems a little high to me.


It was only in the Dallas and NYC regional editions, but still, yeah.



