Disney Research gives a boost to fabric simulation

Smoothed Aggregation Multigrid for Cloth Simulation

Disney Research has found a new mathematical approach to solving the extremely complex equations needed to simulate realistic fabrics.

As you can see from the video released with the official paper, the resulting simulation is extremely accurate. Like almost every algorithm, it’s not a one-size-fits-all solution to every cloth simulation problem: it performs best on large problems (more than 25k vertices), stiffer materials and small masses. Also, as the authors point out, time to solution doesn’t really scale linearly, but there’s room for improvement.
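For readers who have never met a multigrid method, here is a minimal sketch of the general idea in Python, applied to a toy 1D Poisson problem. To be clear, this is not the smoothed-aggregation algorithm from the Disney paper, and every name and parameter below is made up for illustration; it only shows the coarse-grid correction that all multigrid solvers build on.

import numpy as np

# A toy geometric multigrid V-cycle for -u'' = f on (0, 1) with u(0) = u(1) = 0.
# Hypothetical illustration only -- this is NOT the smoothed-aggregation solver
# from the paper, just the coarse-grid correction idea multigrid methods share.

def smooth(u, f, h, iters=3, omega=2.0 / 3.0):
    # Weighted-Jacobi relaxation: quickly damps the high-frequency error.
    u = u.copy()
    for _ in range(iters):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
    return r

def v_cycle(u, f, h):
    if u.size <= 3:                                # coarsest grid: solve exactly
        u = u.copy()
        u[1:-1] = 0.5 * h * h * f[1:-1]
        return u
    u = smooth(u, f, h)                            # pre-smoothing
    r = residual(u, f, h)
    r_c = r[::2].copy()                            # restrict residual (full weighting)
    r_c[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    e_c = v_cycle(np.zeros_like(r_c), r_c, 2 * h)  # solve the coarse error equation
    e = np.zeros_like(u)                           # interpolate the correction back
    e[::2] = e_c
    e[1::2] = 0.5 * (e_c[:-1] + e_c[1:])
    return smooth(u + e, f, h)                     # post-smoothing

# Usage: a handful of V-cycles drives the residual of -u'' = 1 towards zero.
n, f = 2 ** 7 + 1, np.ones(2 ** 7 + 1)
h, u = 1.0 / (2 ** 7), np.zeros(n)
for _ in range(10):
    u = v_cycle(u, f, h)
print(np.max(np.abs(residual(u, f, h))))

Smoothed aggregation, very roughly, replaces the regular coarser grids above with coarse levels built algebraically from the simulation’s own matrix, which is what lets it handle arbitrary cloth meshes.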

Anyway, it’s a great addition to the world of 3D simulation algorithms.


The Moon, today’s maps and the mysterious dark side


There was a time in human history when no living creature had ever seen the hidden face of our beloved satellite.
As you know, earthlings can only admire one side of the Moon. The other side can only be seen from space.
For millennia the laws of physics stood in the way of our quest for knowledge, until one day we learned to use such laws to our advantage.
To go back to that day, we need to set our time machine to an astonishing…

56 years ago

Yep, it was the 26th of October, 1959. A lot of people still alive today were born at a time when we didn’t even know what the other side of the Moon looked like (let alone had landed on it).
Here is how the official mission page describes the achievement:

“The Luna 3 spacecraft returned the first views ever of the far side of the Moon. The first image was taken at 03:30 UT on 7 October at a distance of 63,500 km after Luna 3 had passed the Moon and looked back at the sunlit far side. The last image was taken 40 minutes later from 66,700 km. A total of 29 photographs were taken, covering 70% of the far side. The photographs were very noisy and of low resolution, but many features could be recognized.”

The Soviet spacecraft was equipped with an analog camera, an automated film-processing lab, a scanner and a transmitter. Yes, that was cutting-edge tech back then.

These instruments produced the first picture of the B-side of the Moon. The people who looked at the picture as it was transmitted back to Earth also happened to be the first humans who actually saw the other side of our satellite.

The original picture was this:

First image of the far side of the Moon

The full gallery can be found here.

Today, there’s a little guy up there called the Lunar Reconnaissance Orbiter taking better pictures (among many other things). Here’s a nice comparison made by NASA and published in this article.

Side-by-side comparison of the first ever photograph of the lunar far side, from Luna 3, and a visualization of the same view using LRO data.


UCLA physicists map atoms in 3D

Have you ever tried those magical pieces of software that merge multiple pictures of an object from different angles to produce a 3D model of it?

Good. Now think about upgrading your equipment, because those guys at UCLA do the same with atoms. Seriously.

Using a scanning transmission electron microscope at the Lawrence Berkeley National Laboratory’s Molecular Foundry, Miao and his colleagues analyzed a small piece of tungsten, an element used in incandescent light bulbs. As the sample was tilted 62 times, the researchers were able to slowly assemble a 3-D model of 3,769 atoms in the tip of the tungsten sample.

Here’s the final result:

The 3-D coordinates of thousands of individual atoms and a point defect in a material were determined with a precision of 19 trillionths of a meter, where the crystallinity of the material is not assumed. The figure shows the measured 3-D atomic positions of a tungsten tip, consisting of nine atomic layers, labelled with crimson (dark red), red, orange, yellow, green, cyan, blue, magenta and purple from layers one (top) to nine (bottom), respectively.

Original article here.

Best images of an exoplanet 63 light years away

Beta Pictoris is a star located 63.4 light years from our solar system. It hosts a large debris disk and is estimated to be 8 to 20 million years old.

On November 21, 2008, the Very Large Telescope revealed the presence of an exoplanet orbiting Beta Pictoris. On that day, this image was published by the European Southern Observatory.

First image of β Pic b

The planet, roughly 10-12 times the mass of Jupiter, was named “Beta Pictoris b” (β Pic b). It orbits its star at a distance of around 9 AU (1,350,000,000 km).
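For the record, the kilometre figure is just the AU-to-km conversion, taking 1 AU ≈ 1.496 × 10⁸ km:

$$ 9\,\mathrm{AU} \times 1.496 \times 10^{8}\,\tfrac{\mathrm{km}}{\mathrm{AU}} \approx 1.35 \times 10^{9}\,\mathrm{km} $$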


The orbit of β Pic b compared with our Solar System.

A few days ago, the SETI Institute released the video found at the beginning of this post. It depicts the same planet moving through 1 ½ years of its 22-year orbital period.

Here’s the original article.

As of today, this is the best image of an exoplanet ever taken.


Google’s Codebase Insight: great for small businesses?

The 2015 edition of @Scale featured an extremely interesting insight into how Google handles its codebase.

I won’t repeat what is so clearly explained in the video. Instead, I would like to focus on its potential viability for very small companies.

Indeed, sky-high numbers like Google’s require a huge effort in tooling, standardization and the definition of constraints and procedures. As a result, companies looking to unify their codebase might easily dismiss the idea as impractical.

However, I believe that a monolithic codebase could be a huge improvement in efficiency and reliability for small software houses, for several reasons.


#1: Organization

Small companies tend to have few or no policies regarding repositories, code quality, conventions, etc. Thus, code improvements, modifications, copies, versions, docs and changelogs tend to scatter across company systems, personal computers and the internet. With a single repository, everything can be kept together and under control.

#2: Code reuse

That’s pretty much the same as at Google: you write code once and then use it across multiple products. In small companies, however, code reuse can become a nightmare (think version fragmentation, unmerged customizations and so on). With few or no policies in place, versions can even get lost on some hard disk sitting somewhere. A unified codebase may help keep things tidy.

#3: Everyone knows everything

In small companies, that’s quite common as well. The team is small and everyone knows what the others are doing, even when they’re working on different projects. In a collaborative environment, a single repository makes code browsing easier and shortens the time needed to find snippets and functions. Changes and improvements to shared code and services can be discussed and implemented together, without requiring huge investments in tools.

#4: You can always detach projects

If things go well and the company scales up quickly, you always have the option to detach projects from the main repository. This is obviously a choice that can turn out either great or terribly wrong depending on the context. However, it is always good to know that you have an “emergency button”.
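To make that “emergency button” a bit more concrete: assuming the monorepo lives in plain Git (Google’s actual tooling is different, and the paths and repository names here are purely hypothetical), detaching a project can be as simple as splitting its directory history onto a new branch and pushing it to a fresh repository:

# Rewrite the history of one directory onto its own branch...
git subtree split --prefix=projects/acme-app -b acme-app-standalone

# ...then push that branch to a brand new, standalone repository.
git push git@github.com:example/acme-app.git acme-app-standalone:main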


CONCLUSION

Small companies that cannot invest in the definition of software development processes may benefit from a monolithic codebase like Google’s. It helps keep things organized and accessible while leaving open the option to adopt a different strategy when things get more complex. As always, it’s a decision that needs to be evaluated in its own context, but I believe it’s worth considering.

Toolkit: The Size Of The Universe

The Universe is big…yes, we know that. But the Universe is REALLY big…and we know that as well. And it is so big that you can’t even, omg, we’re insignificant!!1!1! and so on… Since we all have a hard time imagining the immense size of what lies above the thin gas layer that prevents us …

First images from New Horizons’ downlink released!

When I was around 8 years old, I asked my parents to buy me a big book called “Planet’s Atlas”. Since then, I’ve always been extremely fascinated by the mysteries of the Universe.
However, I was a bit disappointed that Pluto didn’t have pictures as beautiful as the other planets’. For almost 20 years I wished that someday I’d be able to see the surface of Pluto as it really is.

Finally, on July 14, 2015, my wish became reality. Now the year-long downlink of every bit of data recorded by New Horizons has finally started.
Here are some of the pictures released by NASA. You can find the related post by following this link.

Thank you NASA for making my wish come true!