Book review: The 4-Hour Workweek by Tim Ferriss
2020-04-26 16:53:57


This book is very well known in the digital nomad and marketing "entrepreneur" community. I like the way the author, drawing on his own experience, invites the reader to rethink common values we take for granted, for instance the 8-hour, 5-day work week. I also like the fact that the book does not claim to offer a magic formula: the reader should experiment and find the life that suits him best.
One enlightening little story Tim Ferriss tells is how he became the improbable champion of a martial arts tournament, not by being the best fighter but by actually reading the tournament rules and noticing that it was possible to win by pushing opponents outside the ring area, something nobody had exploited before.
The title of the book is knowingly misleading, and the author admits it: it was chosen after an A/B test to find the title that would catch readers the most. Instead, Tim Ferriss invites the reader to look for the right work-life balance. He also points out the relativity of wealth, something everybody learns to acknowledge at some point when weighing a highly paid job in an expensive area against a lower-paid job in a cheap one: the higher standard of living may not be where the higher-paid job is.
Tim Ferriss's book is mostly common sense, but it's one of those books that everybody has read, so you have to.
Amiga, the computer I wish I had
2019-12-03 22:51:32


Back at the beginning of the 1990s, there was a war that outclassed even the Gulf War of the same era: the geeky war between the Atari ST and the Commodore Amiga.
As a little kid looking to buy his first computer, I wasn't very computer savvy at the time.
The Atari ST and Commodore Amiga represented the most I could afford, and even that was a stretch: the machines sold for 5000 French Francs with a screen, while our household income was 8000 Francs per month at the time.
The very model I bought was an Atari 1040 STE boasting 1 MB of RAM, a 720 KB floppy drive, and a Motorola 68000 at 8 MHz. The experience might not look so different from what we use today: a GUI, word processing, drawing software, and a solid library of games.
Plenty to be happy with, but I couldn't help looking at what was happening on the other platform, the Amiga. The Amiga 500+ was a very similar machine, but slightly better in so many areas that overall it was a vastly superior machine for a computer kid like me.
The Amiga could be considered the first multimedia computer before it was a thing.
* 32 colors on screen from a large palette of 4096, versus 16 colors out of 512 on the Atari ST. It might not seem like a big difference, but with such a low starting point, even a little more is a good thing.
* 4 voices of digital sample playback, while the Atari ST only had 3 voices of square-wave chip sound, very 8-bit-ish.
* Hardware-accelerated graphics: the Motorola 68000 is no powerhouse, and I very rarely played games with smooth scrolling on the Atari ST. I had to wait for the Super Nintendo to enjoy smooth action games. (Before that, I actually thought it was normal to play games at 10 FPS.)
* A multitasking operating system: that is huge. The Atari ST's GEM is very bare-bones although serviceable; AmigaOS, on the other hand, was an OS well ahead of its time, programmed by a small team of geniuses.
Computers of the era were not ready for a full-fledged operating system, namely for lack of an MMU (memory management unit) to isolate each process. The system was slow and unstable, but hey, it was Linux before Linus wrote his first line.
From a productivity standpoint, the Atari ST was a very solid machine for the time, especially with its high-resolution monochrome screen, for word processing and other serious stuff. It was a cheap Macintosh. But hey! I was a little kid; all I wanted was to play fun games, listen to music, and learn about computers. The Amiga was the better deal.
Besides, I bought my machine in 1992, and its commercial life ended in 1994, with products, games and software disappearing from the shelves. So I basically bought a machine and learnt to use it 2 years before it became obsolete.
Eventually, custom-built home computers disappeared, facing the dropping prices of PC compatibles. I bought my Pentium PC in 1996 with Windows 95 for 6700 Francs, and then followed the PC upgrade race. I still have fond memories of the 16-bit home computer era, with its highly unstandardized hardware and software, mostly because it was my first computing experience, I guess.
Had I bought an Amiga in 1990-something, I would have enjoyed 6 years of computing and gaming.

The 16 bit era

It was a particular era in the sense that, as computers got more and more complicated, it became harder to understand the machine as a whole. Layers of abstraction came to hide the complexity of the system, exposing only a simpler API: today, for instance, nobody addresses the GPU at the register level except driver programmers, and the same can be said about many things: storage controllers, keyboard input, and so on.
This is a change that occurred during the 16-bit era, and on the PC side as well, where you would still talk directly to the hardware, for instance under DOS, a very bare-metal operating system.
Processing power and memory were also very limited at the time, so to make the most of the hardware, people would often find themselves coding in assembly rather than using a high-level language.
I really started programming in 1996/1997 using the legendary Turbo Pascal. My first programs were DOS games using VGA graphics mode 13h, aka 320x200 in 256 colors. I was already late to the party, as game development had shifted towards Windows and DirectX (which is a big layer of abstraction). In a sense, I've never found the same connection with the hardware that I had back then, with interrupts, hardware registers and raw memory addresses.
As for its legacy, the Amiga will remain a footnote in the history of computers, eaten by the PC hegemony. The war between the Amiga and the Atari ST fragmented an already thin market, the "upper low-end". Games were tailored to the inferior hardware, the Atari ST, and the market was never big enough to make it worth optimizing the code for the Amiga hardware. And the market for commercial games on the Atari ST and Amiga was itself small compared to the console game market.
Anyway, the best computer to buy in 1987 was an Amiga by far; by 1992, it was probably best to buy an IBM PC compatible, as technology had moved on.

These figures clearly show that the Atari ST and the Amiga were a niche market even in their time.

The Amiga

What constitutes the Amiga: the architecture and custom chips were designed in 1983, and the first Amiga, the Amiga 1000, was released in 1985.
The very concept of its architecture allowed a high level of integration and performance for the time: unprecedented graphical and audio capabilities for a home computer at a competitive price (around 3500 FF for the central unit).
Sadly, this architecture saw little evolution during its lifetime. The Amiga 500 was the low-cost version of the same hardware, followed by the Amiga 500+ and the Amiga 600. The biggest evolution came in 1992 with the Amiga 1200, the last consumer-grade Amiga, and the little improvement it provided proved not enough to match the PC, whose price had dropped to the Amiga's level.
The technological edge was lost through R&D cost-cutting and short-sightedness. Maybe there was not enough business to be made selling hardware in the low-end market: home consoles, for instance, are sold at a loss, and the profits are made on each game sold. Commodore and Atari, selling computers, did not make a single penny on the software sold for their platforms. Hence those companies would try to make their own consoles, the Atari Jaguar and the Amiga CD32: that is where the real profits were to be made.
Another way to make a sustainable business selling computers is to target high-end consumers, just like Apple did. But even then, let's not forget that what saved Apple was the iPod...
Temple OS and the quest for greatness
2019-11-06 17:45:40

TempleOS is the work of a schizophrenic programmer called Terry A. Davis: an operating system, with a compiler and a library of applications, written by a single developer over 10 years.
Programming all this is a true manifestation of stubborn devotion for a single person, but it's also the testimony of a mad man. Who can commit such an enormous amount of time to building something that would just feel like a curiosity to many?
Impressive as it is, it doesn't bring anything new or revolutionary; at worst, some may say it feels seriously outdated and quirky.
Or maybe it's the stubbornness to achieve something regardless of social reward that can give birth to a ground-breaking new product. Maybe it's the stubbornness to write or to paint relentlessly that creates famous writers and painters.
Terry A. Davis claimed that he wrote TempleOS upon hearing God's voice. One can wonder whether it was really God's voice that directed him towards this crazy project, or something he came up with to justify to himself having invested so much time and energy in it.
In his YouTube broadcasts, Terry A. Davis modestly claimed: "I am the smartest programmer that has ever lived!"
Maybe the "smart" thing is not to spend 10 years of your life writing software that nobody understands or wants, but to lay down your ego and support community projects, or to share your knowledge and skills with others. Or maybe smart is simply to build a family and not pursue meaningless goals.
Greatness is an uncertain path; having a life is the safest one.
For sure, as a programmer myself, I can get carried away by a personal project to the point of losing track of time and reality. Today programming is more of a collective effort, but I truly feel good during long, lonely coding sessions, when I'm in the "zone".
On the same subject, I enjoyed watching the Netflix interactive film Bandersnatch, as I can truly identify with its story. Coding a game at home night and day can alienate you from reality, especially as the main character already has a personality disorder that requires medical treatment.
Ultralearning
2019-10-02 22:20:30


I believe that learning, and being able to learn, is an ever more important element in one's life. Actually, learning and constantly improving is what drives my entire life.
Ultralearning is a book that addresses the subject of how to learn quickly and efficiently. I think the core principles are pretty straightforward:
Meta: know what you need to learn, create a map of the subject, set goals
Intensity: the learning process should occur in an intense manner
Practice: tackle the subject by directly and actively addressing it
Active learning: don't just read a book on the subject; write flashcards and reminders, stay active while learning
Feedback: have someone correct you and evaluate the quality of your learning
By following these simple core principles, the author optimistically claims, a lot of knowledge fields are actually within learning reach of almost everybody, not only geniuses.
You want to learn Japanese? Go to Japan for 3 months, speak only Japanese, discover new words every day, practice, and actively memorize words using flashcards.
You want to learn programming ? Code intensively for 3 months.

I heavily subscribe to these ideas. On a personal note, I learnt to play tennis a few years ago.
While being seriously bad at it, my level only started to improve when I was able to practice every single day, for many hours each time.
I thought I was good, but then I was badly defeated by people with better technique who could hit harder with more consistency. I realised that I still didn't know how to play tennis, and that I needed to learn proper technique.
I started to film myself and compare my strokes to those of good and professional players, then I would change and fix my technique.
To learn properly, you also need to be self-aware: know your strengths and your weaknesses, and work accordingly.
My journey to becoming a tennis player got me through all those points: meta, intensity, practice, feedback. So not only did I learn tennis, I also learnt how to learn, in some way.
Little Tic Tac Toe game
2019-02-06 21:04:26
A small application in ScalaJS / React Native:

http://tictactoe.nicolasmy.com/
A React Journey
2018-11-28 13:32:41
I'm not a front-end dev per se, but I like to keep myself up to date with the latest frameworks. After all, knowing what's considered a good framework in one ecosystem can help engineer better frameworks in my home ecosystem.

React is a component-oriented JS presentation framework.
React is aimed at creating single-page application front-ends.
React is based on a virtual DOM, an in-memory representation of the DOM document: whenever a component is updated in the virtual DOM, only the changed virtual DOM nodes are applied to the actual DOM.
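The idea behind the virtual DOM can be sketched in a few lines. This is an illustrative toy, not React's actual reconciliation algorithm: virtual nodes are plain objects, and a diff of the old and new trees yields only the patches to apply to the real DOM.

```javascript
// Toy virtual-DOM diff (illustrative sketch, not React's real algorithm).
// Compares two virtual nodes and returns the minimal patch list.
function diff(oldNode, newNode, path = 'root') {
  const patches = [];
  if (!oldNode) {
    patches.push({ op: 'create', path, node: newNode });
  } else if (oldNode.type !== newNode.type) {
    patches.push({ op: 'replace', path, node: newNode });
  } else if (oldNode.text !== newNode.text) {
    patches.push({ op: 'setText', path, text: newNode.text });
  }
  return patches;
}

const before = { type: 'span', text: 'Hello' };
const after  = { type: 'span', text: 'Hello world' };
console.log(diff(before, after)); // only a setText patch, the <span> itself is untouched
```

Updating only the changed nodes is what makes re-rendering a whole component tree affordable: the expensive real-DOM mutations are limited to the patch list.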

 
class SignUpForm extends React.Component {

  constructor(props) {
    super(props);

    this.state = {
      email: '',
      password: '',
      activation: false
    };
  }

  handleSubmit = (event) => {
    event.preventDefault();
    RestUtils.performRestPostReq(
      (token) => this.setState({activation: true}),
      "/user/signup",
      [ ["email", this.state.email], ["password", this.state.password] ]
    );
  }

  render() {
    return (
      <span>
        { !this.state.activation &&
          <Panel title="Sign up form">
            <FormContainer handleSubmit={this.handleSubmit} submit="Sign up">
              <FormTable>
                <FormRow label="Email"><FormTextField value={this.state.email} name="email" handleTextChange={(event) => this.setState({email: event.target.value})} /></FormRow>
                <FormRow label="Password"><FormPasswordField value={this.state.password} name="password" handleTextChange={(event) => this.setState({password: event.target.value})} /></FormRow>
              </FormTable>
            </FormContainer>
          </Panel>
        }
        { this.state.activation &&
          <Panel title="Sign up form submitted">
            An email has been sent to {this.state.email}, follow the included link to activate your account.
          </Panel>
        }
      </span>
    );
  }
}
 


React components extend React.Component.
Immutable parameters passed to the component are available in the "props" object.
State is accessible through the "state" object, and is only mutable via the setState() method.
The render method returns the presentation layer as HTML tags and calls to React subcomponents, hence forming a component hierarchy.
The JSX file format allows mixing JS and HTML-style tags for component calls. JSX needs to be compiled to JS, typically by Babel.
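To make the compilation step concrete, a JSX tag is just sugar for a createElement call. The sketch below mimics what Babel's classic JSX transform emits (a simplified stand-in for React.createElement, whose real element objects carry more fields):

```javascript
// <input type="text" value="a" /> compiles to roughly:
//   createElement('input', { type: 'text', value: 'a' })
// Simplified stand-in for React.createElement:
function createElement(type, props, ...children) {
  return { type, props: { ...props, children } };
}

const el = createElement('input', { type: 'text', value: 'a' });
console.log(el.type);        // 'input'
console.log(el.props.value); // 'a'
```

So after compilation, a JSX file is plain JavaScript: the tags are gone, and what render() really returns is a tree of element objects, the virtual DOM nodes mentioned above.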

A pure component (i.e. a component that does not hold state) can also be declared as a simple function returning the presentation layer:
 
function FormTextField(props) {
  return <input type="text" value={props.value} name={props.name} onChange={props.handleTextChange} />;
}
 


React guidelines recommend keeping state only in the highest-level component and passing it down as props through the subcomponent hierarchy: "one single source of truth".
Very well, but React is rarely used by itself. In large applications, state is shared and mutated from multiple subcomponents, hence the need to segregate the way state can be mutated in a single place. With Redux, all application state is held in a "store" object and is only mutated through "action" items dispatched to a "reducer" function.


Our Sign Up form, the Redux way :

A pure component:
 
export const SignUpForm = ({activation, email, password, formSubmit, formEmailChange, formPasswordChange}) => (
  <span>
    { !activation &&
      <Panel title="Sign up form">
        <FormContainer handleSubmit={formSubmit} submit="Sign up">
          <FormTable>
            <FormRow label="Email"><FormTextField value={email} name="email" handleTextChange={formEmailChange} /></FormRow>
            <FormRow label="Password"><FormPasswordField value={password} name="password" handleTextChange={formPasswordChange} /></FormRow>
          </FormTable>
        </FormContainer>
      </Panel>
    }
    { activation &&
      <Panel title="Sign up form submitted">
        An email has been sent to {email}, follow the included link to activate your account.
      </Panel>
    }
  </span>
)
 


A reducer function to process actions:
 
export const getSignUpForm = (state = {email: "", password: "", activation: false}, action) => {
  switch (action.type) {
    case "SET_EMAIL":
      return { ...state, email: action.email };
    case "SET_PASSWORD":
      return { ...state, password: action.password };
    case "SIGN_UP":
      RestUtils.performRestPostReq((token) => {}, "/user/signup", [ ["email", state.email], ["password", state.password] ]);
      return { ...state, activation: true };
    default:
      return state;
  }
};
 


A Redux React component is created by connecting the pure component, the reducer, a mapping between the store and the component's props, and an action dispatcher that sends actions to the reducer from the component's events.
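The store/action/reducer loop can be exercised in isolation. Here is a runnable sketch using a minimal hand-rolled store (a simplified stand-in for Redux's real createStore, not its full API) driving the sign-up reducer, with the REST call omitted:

```javascript
// Minimal stand-in for Redux's createStore (sketch, not the real API surface).
function createStore(reducer) {
  let state = reducer(undefined, { type: '@@INIT' }); // let the default parameter seed the state
  return {
    getState: () => state,
    dispatch: (action) => { state = reducer(state, action); }
  };
}

// The sign-up reducer, with the REST call left out for the sketch.
const signUpForm = (state = { email: '', password: '', activation: false }, action) => {
  switch (action.type) {
    case 'SET_EMAIL':    return { ...state, email: action.email };
    case 'SET_PASSWORD': return { ...state, password: action.password };
    case 'SIGN_UP':      return { ...state, activation: true };
    default:             return state;
  }
};

const store = createStore(signUpForm);
store.dispatch({ type: 'SET_EMAIL', email: 'a@b.com' });
store.dispatch({ type: 'SIGN_UP' });
console.log(store.getState()); // { email: 'a@b.com', password: '', activation: true }
```

Note that the reducer never mutates the previous state: each action produces a fresh object via the spread operator, which is what makes state changes traceable to a single place.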
Introducing "Crypto TradeBot"
2018-11-22 16:38:07
https://www.crypto-tradebot.com

Why “Crypto TradeBot”

From a personal perspective, I did ride the 2017/2018 crypto rally. Although great from a financial point of view, it was very tough on the mental side: how to react, should I sell? Buy more? Buy the dip? Cut my losses? Fear, doubt... The decision making was incredibly difficult, and I ended up making no decisions, or bad ones.
Beyond the decision making itself, I despise the incredible amount of time spent looking at charts, YouTube videos and articles. I want my life back!
I now know it for sure: time is the most precious resource one has, but time is nothing without financial independence.
The current crypto rally might be over, but I won't miss the next one, thanks to my latest effort: "Crypto TradeBot", a trading system that will help me make the right investment choices while mitigating risks and protecting my gains.

More free time and lots of money, gotta love it !
Book review
2018-10-01 17:13:21

Game Engine Black Book: Wolfenstein 3D will revive memories of a time long gone when programming was a bit of sorcery. Algorithms were then evaluated by the CPU cycles they would consume (and of course, every programmer back then knew the cost of each CPU instruction).
Wolfenstein 3D is celebrated today as the first First Person Shooter in history. I remember having played it on my cousin's 286 at sluggish speed. It was a good game, but nowhere near as striking as its successor, Doom. The 3D was... very flat with no lighting; the environment was... cubic. Nevertheless, it paved the way for a whole new genre of video game, although I clearly remember having played a first-person 3D game on the Atari ST before, namely "Midwinter".
Most of all, one can say that it started a new era: the era of PC gaming. Back then, PCs were considered serious appliances for serious business, with the corresponding price tag.
These new 3D action games, only possible on the PC at the time because of the raw power it alone offered, were the downfall of proprietary home computers like the Amiga and the Atari.
It was kind of a surprise because the PC wasn’t meant to be a gaming machine from the start and had clearly some serious limitations :

* A 640 kB limit for executable code, due to the horrendous legacy compatibility requirements of the DOS operating system
* Memory manager software needed to access memory beyond 1024 kB, with only 64 kB visible at a time through a page-mapping scheme
* An additional sound card, with competing standards, required to output any decent sound

On the programming side of things, I would say that this book is more about storytelling and trivia.
Actually, Wolfenstein 3D's rendering technique had already been described in an older book, "Michael Abrash's Graphics Programming Black Book", an everlasting classic for those willing to crunch pages of x86 ASM code.
Besides, the story of id Software, the company behind Wolfenstein 3D, Doom and Quake, has already been covered in the excellent book "Masters of Doom: How Two Guys Created an Empire and Transformed Pop Culture".
All in all, I don't know who this book is for, as the aforementioned books are much more complete and extensive in their own right. I guess it will appeal to all the die-hard vintage programming nostalgics out there.
Ethereum mining outcomes
2017-07-05 20:48:49
As of today, Ethereum is hovering around 270 USD, and difficulty is continuously increasing, making mining less and less profitable with each passing day.
GPUs are in low supply because people are buying them to mine Ether.
So what could the outcome of all this be:


* Price drops: no new miner gets a return on their investment; huge losses for the many who mined hoping for a future price increase
* Price stays steady: difficulty slowly increases, eating into mining profit to the point where ROI takes 6 months to a year. A bad operation for all new miners.
* Price increases to a new ATH, 500 to 1000 USD: huge profit for miners who could afford to mine at a loss while waiting for the right time to sell.
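The break-even arithmetic behind these scenarios can be sketched with made-up numbers (the rig cost, daily reward and power cost below are hypothetical illustrations, not 2017 figures, and difficulty growth is ignored):

```javascript
// Hypothetical figures for illustration only.
const rigCost = 2000;     // USD, hardware investment
const dailyReward = 0.02; // ETH mined per day at current difficulty
const dailyPower = 3;     // USD of electricity per day

// Days to recoup the rig at a given ETH price.
function daysToROI(ethPrice) {
  const dailyProfit = dailyReward * ethPrice - dailyPower;
  if (dailyProfit <= 0) return Infinity; // mining at a loss, never pays back
  return rigCost / dailyProfit;
}

console.log(Math.round(daysToROI(270)));  // ~833 days at 270 USD
console.log(Math.round(daysToROI(1000))); // ~118 days during an ATH run
```

With these numbers the same rig swings from over two years to break even down to a few months, which is exactly why the price scenario dominates everything else.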

Mining is more and more becoming a gamble where people mine hoping for the price to go up.
It is also becoming a war of attrition where only the most efficient miners will be able to turn a profit.
Personally, my rigs have already paid for themselves, so I might just continue and hope for the best. It's the thrill of life after all.

Failure of the Peachy Printer Kickstarter
2017-05-31 22:27:17


I didn't hear about the Peachy Printer until recently, when I found a rather ridiculous video in which the project's author declares having lost all the money to his crook of an associate.

https://www.youtube.com/watch?v=80HsW4HmUes&t=149s

What is it about?

The Peachy Printer was a $100 3D printer project launched in 2013.
To reach such a low price point, the creator of the project claimed to have completely rethought the way 3D printers are built.
The Peachy Printer would use a stereolithography system: liquid resin solidifies when hit by a laser beam.
The only moving parts are mirrors directing the laser to the part of the resin to solidify.
The resin level rises during printing as drops of salt water are added, sinking underneath the thin layer of liquid resin.
This technique is supposed to be very simple and to require very few moving parts.
To simplify things further and to save on driving components:
* Mirror rotation would be driven by an analog signal coming from the sound card.
* Water drop feedback would be given by an analog signal fed into the mic input of the sound card.

Needless to say, starting such a complex project with an original approach is very difficult.
In hindsight, the concept already sounds completely dumb from the start...

So the project failed, after a year of regular updates, with a "bad news" announcement.

Was it a scam?

It's fairly difficult to assert that it was clearly a scam from the start.
The number of videos posted showing people working on and talking about the project leads one to think that there was at least an attempt to deliver something.
That said, the very rough demos that can be seen clearly tell that the project wasn't going anywhere anyway.
Why not just vanish with the money instead of giving regular feedback? I'm more inclined to think that the project was started by somebody who didn't have a clue about how to get from an idea to an industrial product.
After an unexpected Kickstarter success, he started working on developing his idea, only to realize, after having blown a big chunk of the money, that it was simply not feasible.
Claiming that the money was stolen was the only plausible story that would let him keep the backers' money.
For me, the Peachy Printer is a failure turned into a scam.

Hardware projects on Kickstarter

I subscribe to the idea that "complex" hardware projects on Kickstarter (robots, 3D printers, video game consoles) are scams, for multiple reasons:
* Delivering hardware products is extremely hard and requires multiple skills beyond electronics knowledge.
* An idea, if any good, will probably get stolen: it's so easy to copy any electronic device these days.
* Being able to deliver a product to a large customer base requires investment that has to be made upfront, which means the project is already largely funded and didn't need Kickstarter.
* If a Kickstarter backer pays for a product that he actually expects to receive at a good price, who is paying for the R&D?
* As for successful Kickstarter hardware projects:
Oculus Rift: backed by investors and eventually bought by Facebook
Pebble: delivered, but didn't manage to stay in business for long
Ouya: delivered, but didn't manage to stay in business for long
These 3 projects delivered on Kickstarter, but they were not simple hardware devices: the software (drivers, proprietary SDK, proprietary ecosystem) was the one true thing that prevented copying.
Furthermore, none of them managed to stay in business as an independent company, which sums up the harsh difficulty of making and selling hardware devices.