Hacker News | kwaugh's comments

> A double chainring setup likely offers smaller gaps between gears but likely a comparable overall range from lowest to highest.

This isn't true. This bike uses a 40-tooth chainring in the front with an 11-40 cassette in the back. That puts the gear ratio at 1.0 on the low end and 3.64 on the high end. The corresponding Shimano GRX setup with a double chainring would be a 46-30 in the front and an 11-34 in the back. This gives a 0.88 ratio on the low end and 4.18 on the high end, which is actually quite a big difference. One could argue that the difference in gear ratio on the low end isn't a big deal because this bike has pedal assist, but the difference on the high end is quite noticeable.
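The quoted numbers fall out of simple division (chainring teeth over cog teeth); a quick check:

```python
# Gear ratio = chainring teeth / cog teeth.
# 1x setup: 40T chainring, 11-40 cassette.
one_by = [40 / 40, 40 / 11]
# 2x GRX setup: 46-30 chainrings, 11-34 cassette.
two_by = [30 / 34, 46 / 11]

print([round(r, 2) for r in one_by])  # [1.0, 3.64]
print([round(r, 2) for r in two_by])  # [0.88, 4.18]
```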


No one (basically) is going to care about the difference between a 46/11 and a 40/11 when the assist drops out at 25kph. If anything, you’re much more likely to be able to use the 40/11.

At the other end, the assist is going to cover a lot of gearing, and it’s going to make the gaps in a 1x setup a lot less important.

(FWIW, I ride a 1x42/11-42 and a 2x 50/34-11/32. And a triple on the tandem, and I lived with a cx geared 1x 39-12/28 for a few years. High gears are drastically overrated unless you’re racing)


I definitely agree that high gears are overrated except for racing, but when the bike markets itself as a lightweight semi-aero bike, isn't the point to go fast? If you're not going faster than 25kph, then why do you need deep-section carbon wheels?


The 25kph cutoff is the legal limit for this category of ebike in the EU; it's 20mph in the US.

I think some of the price of the carbon comes out of the marketing budget. In a way, the choice between a 4k ebike with cheaper-looking parts and a 4.5k bike with "premium" parts comes down on the side of spending the money.

I'd estimate that bike would sell for ~2k without the electrics. It's nice, but it's not super specced. High-end carbon rims are expensive, but they're not _that_ expensive compared to aftermarket aluminum rims. Niceish rims are basically 80-100 now. Carbon is probably 150 or so. (That's retailish, but not MSRP - what I could get them for, not what a bike co could get them for wholesale.)

OTOH, it's lighter than my gravel/commuter, which doesn't even have electrics, so there's that. But I'd need rack/panniers for the commute, and I'd have range worries there.

What I'd like to see is a lightweight system that adds ~100W to my output for ~4 hours. Something that's not painfully heavy if it's off, but makes the last hour of the commute better. (I do 2hrs each way when I do it.)


Only if you're in a hurry.


I mean no offense, but many of the advantages you've cited are claims from the marketing departments of big manufacturers rather than proven advantages of the tech. The author of the comment you replied to seems to understand the trends in bike design these days and feels that these trends don't fit well into the e-bike market segment. I agree.

> it’s not only the weight that carbon frame bike owners are after - the material gives much better ride quality than aluminium.

The effect the frame has on ride quality is significantly smaller than that of other components, like the tires and seatpost[0]. Carbon often offers only a small weight improvement over a well-made steel or aluminum bike, so if you're buying an e-bike because you want pedal assist, I don't understand the need to shave 1 or 2 pounds off the bike.

> Deep rims are nothing spectacular either, their weight penalty is nicely offset by aero gain.

This appears to be true. Most of the bike's aero penalty comes from the wheels, and deeper wheels do reduce drag a lot.

> One chainring is all the rage in bikes now

It is all the rage, but it's not clear yet whether it's justified or just a ploy by the manufacturers to save costs on their end. The gear ratio of this bike goes from 1.0 on the low end to only 3.64 on the high end. 1.0 is pretty good for going up hills, especially with the motor assist, but 3.64 is really low for a bike touting itself as really fast! I don't understand this choice. Most road bikes have a top-end ratio of around 4.5 (e.g. 50/11).

> Mudguards and simple handlebars make this bike utilitarian and well-suited to commuting

The mudguards are nice, but I don't understand the choice of handlebars. Why would you want a super-aero, lightweight bike and then put flat handlebars on it, so you can't get into an aero position the way you can with drop handlebars? This doesn't make sense. Furthermore, the bike doesn't have mounting points for a front or rear rack, which is a huge downside for commuting, if not a dealbreaker.

[0] https://www.cyclingabout.com/why-impossible-steel-frames-mor...

[edit] you're right that trends do matter in shaping people's purchasing decisions, since most people aren't very informed about the actual pros/cons of the tech they're buying, but this doesn't undermine the original commenter's opinion that this bike doesn't make sense from a technical standpoint.


I agree with most of your points, especially that demand is strongly influenced by marketing/fashion and perceived value of the product, not objective value.

Diving deeper into gears and handlebars - I believe it's a very conscious choice.

- max speed at an optimal cadence (90 rpm) in this setup seems to be around 43km/h. This is plenty fast and definitely above typical commuting speeds. Even the electric support is more about better range/less sweat than making the bike fast; speeds approaching 50km/h are close to the legal limits anyway

- with straight handlebars it's comfortable to ride, no matter your level of personal fitness. Drop bars make you bend over more - that isn't very inviting and could drive off customers accustomed to relaxed positions (the ones they know from riding in their SUVs daily). And with speeds mostly well below 40km/h there is no need to curl into the aero position. Just put on some trendy glasses and the electric support will deal with the extra drag.
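For what it's worth, the ~43km/h figure is easy to sanity-check. The wheel circumference below is my assumption (roughly a 700c wheel with a 32mm tire), not a published spec:

```python
# Speed at cadence: cadence * gear ratio * wheel circumference.
# Circumference of ~2.17 m is an assumption (approx. 700c x 32mm);
# the bike's actual tire size may differ slightly.
cadence_rpm = 90
top_ratio = 40 / 11        # the 1x top gear discussed upthread
circumference_m = 2.17

speed_kmh = cadence_rpm * top_ratio * circumference_m * 60 / 1000
print(round(speed_kmh, 1))  # ~42.6, i.e. in the ballpark of 43km/h
```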

It really goes hand in hand, and I'd be surprised if such a bike were released in any other config.


Those are fair points. I guess I'm most confused by the inclusion of mid-section rims. It makes me think they're targeting people who want to go really fast, but their other choices don't pair well with going fast enough to get the gains out of the wheels.


Apple handed out free cases for the iPhone 4 for a while to mitigate issues where the cell signal would drop when you held the phone a certain way. This type of action wouldn’t be unprecedented.


They made and sold the cases, though, before it became a workaround.


They let you get third party cases valued at up to ~$30 for free. This wasn’t just for Apple branded cases. In fact, I believe this was before Apple sold first party iPhone cases.


No


> Now, that being said, there is one minor question I have, and that is, how would backpropagation apply to this apparently one-way model?

The author mentioned that this is only for inference of neural networks (not training), so this does not support backpropagation.


This kind of misses the point of the Unix philosophy, which is being able to dynamically reconfigure things. Realistically, to get decent results you'll need to run inference with the exact same connections you trained (or at least fine-tuned), so there's no good reason to split the model into smaller parts.


I think that there is room for this idea at a higher level of abstraction where pre-trained sub networks are exposed as command line utilities.

cat tweets.txt | layer language-embed | layer sentiment > out.txt


My point was that this is not possible, as the trained layers are intrinsically tightly coupled. You can't combine pre-trained sub-networks in an arbitrary manner without retraining. In the standard practice of reusing pretrained networks, you take a pretrained network (or part of one) and train some layers around it to match what you need, optionally fine-tuning the pretrained layers as well. If you want to use a different pre-trained embedding model, you retrain the rest of the network.

In your example, the sentiment layer will work without re-training or finetuning only if preceded by the exact same language-embed layer it was trained on. You can't swap in another layer there. Even if you get a different layer with the exact same dimensions, the exact same structure, the exact same training algorithm and hyperparameters, and the exact same training data but a different random seed for initialization, it can't be a drop-in replacement. It will generate different language embeddings than the previous one - i.e. the meaning of output neuron #42 being 1.0 will be completely unrelated to what your sentiment layer expects in that position, and your sentiment layer will output total nonsense. There often (but not always!) exists a linear transformation to align them, but you'd have to explicitly calculate it somehow, e.g. by training a transformation layer. In the absence of that, if you want to invoke that particular version of the sentiment layer, you have no choice about the preceding layers - you have to invoke the exact same versions as during training.

Solving that dependency problem requires strong API contracts about the structure and meaning of the data being passed between the layers. It might be done, but that's not how we commonly do it nowadays, and that would be a much larger task than this project. Alternatively, what could be useful is that if you want to pipe the tweets to sentiment_model_v123 then a system could automatically look up in the metadata of that model that it needs to transform the text by transformation_A followed by fasttext_embeddings_french_v32 - as there's no reasonable choice anyway.
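That failure mode is easy to demonstrate with a toy sketch. Everything below is made up for illustration - the lookup-table "embed layers", the linear "sentiment" readout, the shapes - but it shows the same coupling: two architecturally identical layers that differ only in random seed, where a readout fit against one produces garbage with the other.

```python
import numpy as np

# Two "language-embed" layers: identical shapes, different seeds.
rng_a, rng_b = np.random.default_rng(0), np.random.default_rng(1)
embed_a = rng_a.standard_normal((16, 16))   # token id -> 16-d embedding
embed_b = rng_b.standard_normal((16, 16))   # same shape, different seed

tokens = np.arange(16)
labels = (tokens % 2).astype(float)         # arbitrary "sentiment" targets

# Fit the linear "sentiment" readout against embeddings from layer A only.
w, *_ = np.linalg.lstsq(embed_a[tokens], labels, rcond=None)

def accuracy(embed):
    preds = embed[tokens] @ w > 0.5
    return float(np.mean(preds == labels.astype(bool)))

acc_a = accuracy(embed_a)   # the layer the readout was fit on
acc_b = accuracy(embed_b)   # a "compatible-looking" swapped-in layer
print(acc_a, acc_b)         # perfect on A; degraded on B
```

With fixed seeds the run is deterministic; the point is only that a readout trained downstream of one layer carries no meaning downstream of another, even though every shape matches.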


Yes, I understand how neural networks work. In my example, language-embed and sentiment are provided by layer. This allows layer to provide compatible modules. If two incompatible modules are used together, they might produce junk output. That is true for any combination of command line utilities. If I cat a .jpg, I'm going to have a hard time using that output with sed.


Aaronson has since commented on this paper on his website. He still believes that it's a bogus proof.


Aaronson has been one of my favorite professors. His undergrad quantum information science class was great. Super smart guy, approachable, good lecturer, good person. A+ dude. I recommend following his blog if you don't already.

