Hannah Clare Wray Hazi

I love making beautiful things.

Mar 13, 2017

Additive Manufacturing Is Not Going To Save The World (Sorry)

What are we talking about here?

Additive manufacturing, more popularly known as 3D printing, isn't just one thing. It describes a vast range of technologies. What they have in common is that they build up a product by repeated, computer-controlled deposition or binding of material, layer by layer. The additive technologies have improved to the point where some can be used to produce final parts for end users as well as prototypes and ‘looks-like’ models. Real customers have been found for 3D printed products, especially in the medical sector. Hearing aid shells, dental braces, shoe inserts and other low-volume, complex, bespoke products are areas where additive manufacture has an advantage over conventional techniques as a final production method. As an enthusiast for additive manufacturing, that's really good to see. But there's still a lot of hype and misunderstanding around the technologies. I'd like to clear that up here, in what will (unashamedly) be a bit of a long and ranty post. So strap in!

3D printing processes

Additive manufacturing technologies have existed for decades, but the recent development of smaller, more accessible machines has drawn media attention. The vision of a world of distributed manufacture, where the means of production sit on every desktop, is compelling. The technology has been hyped by commentators as the manufacturing tool of the future – something that will “transform our lives”, “revolutionise the global economy” and “change the future of industry”. Enthusiasts advocating for the power of additive manufacturing tend to say things like

"I don’t pretend to be impartial. This is a future I believe in, and I’m one of many working to build it." - Chris Anderson, Makers: The New Industrial Revolution

"3D printing will change the way things are produced more in this century than the industrial revolution did over the last 300 years... Industrial 3D printing will forever change the world as we know it. Are you ready?" - Rick Smith, Forbes

I used to be one of these very excited people. To a certain extent, I still am - there are some wonderful things that only additive manufacturing can achieve, and it really is transforming certain industries. For example, the hearing aid business has undergone a quiet revolution. All hearing aid shell manufacturers now offer personalised shells made by additive processes such as SLA and SLS. Dental implants aren't far behind.

Gartner's Hype Cycle

However, the fact is that additive manufacturing is not commercially viable for producing larger numbers of products. For production runs of over around 500, conventional production technologies are far more practical. Additive manufacturing is currently expensive, slow, requires more design time than commonly imagined and constrains production to small batches. Gartner’s Hype Cycle for 3D Printing predicts that both 'Consumer 3D printing' and '3D printing in Manufacturing Operations' will soon come out of the ‘Peak of Inflated Expectations’ and enter the ‘Trough of Disillusionment’.

How far have we gotten toward World Domination?

At the desktop level

There have been game-changing reductions in the cost of less sophisticated machines. This has been driven by hobbyists and open-source developers such as the RepRap project led by Dr Adrian Bowyer. RepRapPro recently became a victim of its own success, closing for business in January and releasing a statement that

“The market for low-cost 3D printers is now so crowded and so competitive that a small specialist company like ours cannot expand. So, because we are not bankrupt and we do not have any debts to pay, we have chosen to stop now while we are ahead and to concentrate on other activities.” - RepRapPro

A message from Dr Bowyer on the RepRap.org forums explains further that

“this great flowering of small companies (all essentially based on the RepRap Project) making commercial life difficult for each other was pretty-much what I predicted when I started the Project… But we expected it to take decades, not four years.”

A typical small desktop FDM machine now costs in the region of hundreds of pounds. Profit margins on these hobbyist machines have eroded, and no further large drops in price can reasonably be expected.

Desktop printers were created that would work ‘out of the box’ without the need for great additive knowledge and DIY skills. Firms like MakerBot and Ultimaker targeted those who want to access the technology but do not have the skills or time to build and maintain their own printers. However, this new generation of machines is still not reliable. MakerBot took an ‘Apple’ approach to development and hid the complexity of their product from users, closing the source code and preventing user modifications. Sadly, desktop printers are not yet reliable enough to be operated by unskilled users without significant risks of failure. MakerBot is now facing a class action suit about failures of its ‘Smart Extruder’ and has been called a “dead company walking” after laying off its manufacturing staff, closing its retail locations and outsourcing all production to China. (Update: the class action lawsuit was dismissed but 3dPrintIndustry is still calling the company "a shadow of its former self").

"Anyone who has ever been to a hackerspace has seen a MakerBot printer, but that printer was broken." - Brian Benchoff, HackaDay

Whether using a pre-assembled machine or a kit made from open-source parts, what can be printed with a desktop FDM (fused deposition modelling) machine is mostly limited to plastic trinkets that “looks like I bought it in a panic at a jumble sale for 10p”, as one journalist observed. A typical quote from a ‘Maker’ I interviewed at Newcastle MakerSpace was “I do spend more time trying to fix the printer than printing things”. An insightful member of the ‘Maker’ community, Dominic Morrow, pointed out in an interview with me for my Masters project a couple of years back

“In our community – the maker community and hackspace community – 3D printing isn’t about 3D printing, it’s about building a 3D printer.”

I've experienced this myself and found it quite frustrating. I was mainly interested in making things, and found the process frustrating on most machines; for many 'makers', though, the quality of the output is secondary to the pleasure of building and tinkering with the printers themselves.

At the industrial level

Prototyping houses such as Shapeways, Sculpteo and iMaterialise have succeeded in popularising the larger, more professional additive technologies. They fill a niche by offering access to expensive methods such as DMLS (Direct Metal Laser Sintering), which is out of reach for individuals because the machines cost millions of pounds. However, these companies don’t offer anything fundamentally new to manufacturing except for their business model. Their machines can produce higher quality results than the desktop machines and at a larger scale. But they are still slow compared to conventional processes such as injection moulding or milling, and they require skilled operation.

Sculpteo recently released a white paper on the batch size problem. For 5 representative parts they compared costs quoted by injection moulding companies Sinomould, Quickpart and Protomould to their own print-costing algorithm for SLS. Injection moulding is a conventional manufacturing process which involves injecting heated plastic under pressure into a mould. Its costs include start-up costs for tooling, so this process is more expensive than additive manufacturing at first. But as the number of parts produced increases, injection moulding catches up due to its lower cost per additional part.

Remote control from the Sculpteo study

The average break-even point was 436 units. After that, injection-moulding was less expensive per part.

“Should the surface finish and material properties of the 3D printed part serve the needs of the desired application, then 3D printing remains an economical manufacturing method for up to 500 unit production runs (dependent on unit size).” - Sculpteo

And bear in mind that Sculpteo had every incentive to make sure this figure was as good as possible! As an additive manufacturing bureau they naturally want to portray their technologies favourably.
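The arithmetic behind a break-even figure like this is simple enough to sketch. The tooling and per-part costs below are invented purely for illustration (they are not Sculpteo's figures), but the crossover logic is exactly the one the study describes:

```python
import math

# Hypothetical cost model for the batch-size crossover. The tooling and
# per-part figures are illustrative assumptions, not Sculpteo's data.

def injection_moulding_cost(units, tooling=5000.0, per_part=1.50):
    """Total cost: a large up-front tooling cost plus a low marginal cost."""
    return tooling + per_part * units

def sls_cost(units, per_part=13.0):
    """Total cost: no tooling, but a higher marginal cost per part."""
    return per_part * units

def break_even(tooling=5000.0, im_part=1.50, sls_part=13.0):
    """Smallest batch size at which moulding becomes the cheaper option."""
    # tooling + im_part * n <= sls_part * n  =>  n >= tooling / (sls_part - im_part)
    return math.ceil(tooling / (sls_part - im_part))

print(break_even())  # 435 units with these made-up figures
```

With real quotes the curves shift, but the shape of the answer - a crossover at a few hundred units - is what the study found.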

Better software has also improved printing results; improvements in slicing engines, better design of support materials, and semi-automatic design and placement of struts all contribute toward higher quality prints. Bedrich Benes, an academic at Purdue University, and Radomir Mĕch, from Adobe Systems, worked on developing software to improve CAD files. Mĕch explained to me during an interview for my Masters project

“So there are basically two stages. One is that you can detect structural issues. And this can be found somewhere on the market. And the second thing is that you solve them. And you can solve them during the design phase, when you are creating the 3D object. [Or] you can solve them post when the object [is] created.”

Their work has been very useful to the professional 3D printing community in reducing the costly cases where an unprintable part is sent out to be printed. The software they developed can do things like increase the width of a fragile neck on a sculpture, to improve the odds that it will print successfully.

Software can also be useful for lightweighting objects to be 3D printed. ‘Inspire’ software from solidThinking generates optimal organic shapes and struts from given envelopes, minimising material usage and maximising strength in the printed objects. Their case studies showcase the software's ability to reduce weight by 50% or more.

“We are very satisfied with the results achieved. We have realized better designs and could fulfill all structural requirements in a shorter term.” - Juan Manuel Romero, Alstom, solidThinking case study.

Although in the Alstom case study time to manufacture was shortened, this was because the iterative cycle of design optimisation was shortened – the actual additive printing process usually still takes longer than conventional casting or milling. As pointed out by ProtoLabs:

“The process of melting metal one ultra thin layer at a time also isn’t terribly fast — our instruments may take a few days to build. For many parts, CNC machining remains the most economical choice.” - Protolabs, Manufacturing Design Tips.

Plunkett Associates, who deal in conventional and additive metal prototyping, explain

“There are limitations to be aware of, slow build speed, restriction of build volumes… support structures are required…these can be difficult to remove.”

This is reflected in the timescales – DMLS takes 2 weeks for Plunkett to turn around, sand casting just one week. Software solves important structural design problems unique to additive manufacturing, but has not enabled any radical improvements in speed.

Fundamental Limits

Speed

Time taken to create an object is proportional to the number of printed layers; in additive technologies, z height dominates build time far more than movement in x and y. Whether it is a swipe-and-refresh stroke to lay down fresh powder, the movement of an object in a resin bath or the motion of an FDM extruder, z motion is the most crucial. As Joseph DeSimone says, “There are some mushrooms that grow faster than 3D printed parts.” To print parts quickly, they must be thin, with few layers - not very '3D' at all.

Some, like Rob Winker of Stratasys, will argue that this doesn’t matter much if you are accustomed to leaving the printer running overnight. But this approach fundamentally limits additive manufacturing to small batch production. Imagine being told you could only produce car body panels by leaving the machine running slowly overnight!

Print head speed and acceleration restrict the speed at which the printer can produce each layer. This is a somewhat artificial limit – one can imagine a printer where the head(s) are stationary and a belt is swept underneath at high speed. In DMLS and SLA (stereolithography) the problem turns into a raster scanning one – how fast a laser may sweep across a bed of powder or a tank of resin while tracing out a pattern.

Fundamental speed restrictions are dictated by the physical properties of print materials that must melt and solidify. Take the example of depositing thermoplastic through a heated extruder nozzle. An elegant study by Stuart Oliver determined the speed limit for PLA in the Ultimaker extruder to be around 8–10 mm³/sec for a 0.4mm diameter nozzle. Beyond that point

“the constant excessive push of plastic into the hot end raises the pressure in the molten plastic, making it more likely that the molten plastic, unable to escape the way it is supposed to go, will instead find its way back up … and form a jam”.
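One consequence of a volumetric flow ceiling like Oliver's is easy to sketch: the cross-section of the extruded bead caps the achievable head speed, no matter how fast the motion system is. Treating the bead as a simple rectangle (layer height times line width) is my own simplification for illustration:

```python
# Max head speed = volumetric flow / bead cross-section. The rectangular
# bead cross-section is a simplifying assumption for illustration.

def max_head_speed(flow_mm3_per_s, layer_height_mm=0.2, line_width_mm=0.4):
    """Fastest printing speed (mm/s) the hot end can sustain."""
    return flow_mm3_per_s / (layer_height_mm * line_width_mm)

print(max_head_speed(8))   # ~100 mm/s at a 0.2 mm layer and 0.4 mm line
print(max_head_speed(10))  # ~125 mm/s: the ceiling, whatever the motors can do
```

Notice that the 8 mm³/s limit at these typical settings lands right around the ~100 mm/s speeds quoted for ordinary Cartesian FDM machines.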

This is not only a plastics issue. Currently, “steel powder is quite slow” and “expensive equipment (is) needed for essential post-processing” according to the textbook 'Developments in Rapid Casting'.

Expense

There is a fundamental disconnect between the price of hobbyist desktop equipment and the price of professional machines. No hobbyist could afford to buy or run a metal sintering machine. Print bureaux give more access to the technology, but at a high price – typically £25+ for a small metal part. Regardless of machine, there is little economy of scale in additive manufacture. If an increase in production volumes is needed, manufacturers naturally turn to conventional processes. “Generally the material is expensive to buy, and the process is slow” advises print service bureau 3D Print UK.

Desktop 3D printers have been compared, enlighteningly, to the bread-making machine fad of the '90s – appealing as a novel concept, but not about to replace conventional mass production methods due to cost and inconvenience.

The cost of materials is very high, typically around £35/kg for high quality plastic filaments, and ten times that or more for resins and powders. Some of this material will inevitably be wasted in swipe-and-refresh operations, support material printing and resin exposure to light, raising product costs further. Compare this to the raw material prices when purchasing even high quality engineering plastics in bulk and the enduring appeal of conventional methods becomes clear.
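A back-of-envelope comparison shows what this means per part. The filament price is the ~£35/kg figure above; the bulk pellet price and the part mass are my own rough assumptions for a commodity engineering plastic:

```python
# Illustrative per-part material cost: filament at ~£35/kg vs bulk
# pellets at an assumed ~£2/kg. Part mass and waste fraction are
# also assumptions for illustration.

def material_cost(part_mass_g, price_per_kg, waste_fraction=0.0):
    """Material cost in GBP for one part, including a wasted fraction."""
    return (part_mass_g / 1000.0) * price_per_kg * (1.0 + waste_fraction)

part_mass = 50  # grams
print(material_cost(part_mass, 35.0, waste_fraction=0.2))  # ~£2.10 printed
print(material_cost(part_mass, 2.0))                       # ~£0.10 moulded from pellets
```

A factor of twenty on raw material alone, before machine time is counted.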

Material cost comparison Note this is a logarithmic scale.

“Sadly for every request we have for a full sized Daft Punk helmet, there’s an equal number of disappointed Daft Punk fans out there, when they find out how much it will cost to build” - Nick Allen, Why 3D Printing is Overhyped (I Should Know, I Do It For A Living).

Materials

In general, chemistry must be precisely understood and controlled – particularly surface wetting, since this controls the height and resolution of layers and, in droplet-based printing, interlayer bonding. High wetting tends to give thinner layers with a stronger bond.

Materials for additive manufacture are peculiarly vulnerable to changes in the environment that have a knock-on effect on print quality. Humidity, ambient temperature, reel storage time and conditions all affect results for plastic prints. Excess humidity causes water absorbent filaments such as nylon to noticeably swell and PLA filament to become more brittle and degrade.

New materials are being developed continually for additive manufacturing. The range now includes ceramics, elastomers, composites and waxes. These materials do not deliver any changes to the speed of manufacture however, and each material has its own unique challenges in printing.

Maintenance and reliability

OEE, Overall Equipment Effectiveness, is used by manufacturing engineers as a measure of the percentage of time equipment is used for making useful products. “World class” OEE for machining centres and other non-continuous processes is about 85%. Continuous, 24 hour/day processes like the extrusion of pipes or beams can achieve 90% OEE or greater. Additive manufacturing tends to fall far beneath this, though it is difficult to gather exact data due to the secrecy of the main players. Additive manufacturing machines are most suited to custom production runs and small order quantities, which brings down OEE metrics further. Due to these restrictions, few final parts are made using additive techniques. Wohlers’ surveys indicate less than a quarter of additive sales are of functional parts. Their data from 2011 indicates around 13% of sales are direct part production. Additive production is still dominated by research and prototyping.
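For readers unfamiliar with the metric: OEE is conventionally the product of three ratios - availability, performance and quality. The example figures below are invented purely to illustrate how setup-heavy, failure-prone small runs drag the number down:

```python
# OEE = availability x performance x quality, each a ratio in [0, 1].
# Both sets of figures below are invented for illustration.

def oee(availability, performance, quality):
    """Overall Equipment Effectiveness as a fraction of ideal output."""
    return availability * performance * quality

print(oee(0.90, 0.95, 0.99))  # ~0.85: 'world class' for a machining centre
print(oee(0.60, 0.70, 0.80))  # ~0.34: a printer idled by setup, failures and scrap
```

Because the three factors multiply, even moderately poor uptime, speed and yield compound into a dismal overall figure.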

Surface finish and post-processing

A common myth about additive manufacturing is that parts require no assembly or post-processing. In fact, extensive finishing is required to give printed models an appearance comparable to moulded parts. These finishing techniques are considered a 'dark art'.

Squirrel finished with acetone smoothing technique Squirrel models finished with acetone vapour smoothing by the fine folks at the RepRap blog.

As Matt Griffin explains,

“Makers who have mastered finishing techniques are granted wizard status by fellow 3D practitioners.”

Examples of surface artefacts from additive processes include white marks which can appear after support material is removed, ‘stair stepping’ marks and grainy surfaces. Stair stepping marks are more noticeable on shallower curved surfaces and are “a natural artifact of 3D printing” which cannot be avoided. Parts must often be smoothed, varnished or coloured after printing and will rarely achieve a mirror finish. These problems are not restricted to amateur users – all reputable print bureaux have a disclaimer section about finishing. 3D Print UK point out that these disadvantages are “either ignored or misrepresented by many sources on the internet and elsewhere”.

Mechanical properties, accuracy and design

Layerwise construction leads to mechanical weaknesses. These are inherent to the production process and difficult to avoid entirely. The orientation of a part on the build plate has a large and unpredictable impact on its strength and function. In the z direction parts tend to be weaker and can sometimes even be broken apart with bare hands. Thus, manufacturing parts such as springs with demanding mechanical properties additively is difficult.

Accuracy can be a problem – users cannot simply print from a CAD file and expect the dimensions to match up, especially with desktop machines. In theory, printing complete mechanisms such as gearboxes in one piece is possible. In reality, it is often too difficult to be worthwhile even for a professional workshop. 3D Print UK say about SLS

“It is always better to manufacture the parts individually and then assemble them after. Even better, buy off the shelf gears from a supplier and 3D print the box that holds them all together.”

Designers must also account for warping, shrinkage, and laboriously troubleshoot failures. Spencer Wright laments

“DMLS is anything but plug-and-play. Even when a design has been optimized specifically for the process, it often takes dozens of tries before a functional part comes out of the printer. And the process of troubleshooting a failed build — even at the most advanced DMLS shops in the world — still involves a lot of trial and error.”

Limits to the shapes that can be printed

The hype leads many to think that additive manufacturing can make any product. This is incorrect. Mastery of the limitations of the technologies can take a lifetime. Designers must use their knowledge of additive processes to craft appropriate parts, just as they have learned to work with the limitations of each conventional process. The spectrum of difficulty is not linear – similar products can be vastly different to manufacture.

  • 2D+ printing is relatively straightforward. It restricts model designs to prisms and pyramids. It can be done with less viscous materials, such as food substances, that tend to spread on printing.

  • Overhangs are more difficult. The critical angle of collapse is different for each material. Holes and hollow structures may be created using overhangs, though not all hollow structures are possible.

  • Bridging across unsupported gaps is the most difficult, yet it is needed to create truly versatile 3D structures. Some materials cannot be bridged at all, and most can only bridge a limited gap length. Powder-based techniques such as sintering allow unrestricted bridging, because the surrounding unfused powder acts as inherent support.

Depending on materials, some things are just not possible with current technologies: large parts, very thin wall sections, wires, or completely enclosed parts with no escape holes in powder sintered materials. “For instance, a Klein bottle could be printed in metal – but no matter how you oriented it, there would likely always be support structures stuck inside its fat end.” Interlocking parts or pre-assembled structures are one of the most hyped things about additive manufacture, yet these are not possible in many materials either. Shapeways explain that in steel they cannot produce interlocking or enclosed parts because “While the product is being transferred from the printer to the infusion chamber, it exists in a delicate "green state" which does not support interlocking parts.”

Looking to the Future

It's exciting to think of a fully additive manufacturing process running at true line speed and overcoming these limitations. There are some impressive-looking technologies and products out there that seem to demonstrate the great potential of additive manufacture at true line speed. I'll examine each in turn to determine whether they are truly revolutionary advances, or only incremental ones.

Faster FDM printers?

Mini Kossel

Delta printers such as the Mini Kossel are a recent development in the FDM arena. These typically offer speeds of up to 320 mm/s, compared to 100 mm/s for an average Cartesian FDM set-up. This is because of the way their heads move – instead of x-y-z Cartesian motion, the print head is held from above on three tilting arms, giving a lighter moving assembly, though working out the motion paths requires more processing power from the printer controller. This is definitely useful, but it is not a game-changing improvement in speed – and it begins to run into the fundamental material limits on laying down plastic.

Software improvements are likely to deliver better bed space optimisation, automatic strut placement and strengthening, and better processing of CAD files. Hopefully we'll see even better expert software systems to optimise parts for the limitations of additive manufacturing. However, even this would only put the factors already understood by expert designers into the hands of ordinary users.

High throughput extrusion heads such as the Volcano from E3D Online may improve print speed for FDM plastics – pushing the bottleneck back to the acceleration the print-head is capable of. “By simultaneously increasing the amount of plastic the hotend can process per second and also increasing the maximum layer height we can gain drastic reductions in print time.” Typical reductions are around 40% compared with a standard 0.4mm hotend. However, there is a trade-off between speed and resolution.

New technology?

Intelligent Liquid Interface technology from NewPro3D and Continuous Liquid Interface Production from Carbon3D are very similar technologies which claim speeds of up to 25–100 times faster than conventional stereolithography. These companies are currently the “hottest startups to come along in the emerging 3-D printing industry” and have already attracted much venture capital funding.

Their (similar) technologies are indeed very promising. The underlying concept is similar to standard stereolithography: the printed object is pulled continuously out of a bath of liquid resin that is shielded from the UV cure by a selectively permeable membrane. Oxygen flux through the ‘build window’ is controlled; oxygen inhibits the cure and UV enables it. This can potentially achieve smooth, ‘layer-less’ construction and very tall parts, fast. Figures from NewPro3D give 4.5 minutes for a standard hollow ball. In comparison, the same product takes 180 minutes with Polyjet and 690 minutes with SLA.
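It's worth doing the arithmetic on those quoted figures (which are NewPro3D's own marketing numbers, so treat them cautiously):

```python
# Speed-up arithmetic on the quoted NewPro3D figures: 4.5 minutes for a
# hollow ball vs 180 minutes (Polyjet) and 690 minutes (SLA).

def speedup(baseline_min, new_min=4.5):
    """How many times faster the new process is than the baseline."""
    return baseline_min / new_min

print(speedup(180))  # 40.0x vs Polyjet
print(speedup(690))  # ~153x vs SLA
```

Interestingly, the quoted figures imply an even bigger ratio against SLA than the headline 25–100x claim.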

This is far faster than any other additive technology, but is it game-changing? There is still a curing time limitation and some concern about the physical properties of the finished resin parts. It is likely the parts will degrade in sunlight as normal SLA parts tend to. Carbon3D has the most compelling data to back up its material development and assuage these concerns. But they admit only a few materials will be eligible for their techniques. There are also valid concerns about vibrations and dust contamination of the feed pool. As experienced analysts pointed out

“3D printer breakthroughs are like battery breakthroughs and cancer cures: there is a new one every week, and yet, remarkably, very few ever make it to market. Of course, progress is being made although a lot of that is much slower than you would imagine” - Is the new Carbon3D Printing Technology a Breakthrough or Just Hype?

Reducing materials costs?

Filament "factory" You too could spend years recreating a piece of industrial plastic handling equipment in your own home!

Locally extruding filament could reduce the large cost of materials for FDM prints and help the environment by recycling plastic. One study says “widespread adoption of in-home recycling of post-consumer plastic represents a novel path to a future of distributed manufacturing”. However, this is not straightforward. Any injection moulding engineer will tell you that controlling the quality of plastic feedstock can be very challenging. It is not as simple as regrinding old prints or milk bottles and using them to extrude fresh filament, as suggested in the media. There will be problems with plastic degradation, inconsistent filament diameter, and incompatibility between different types of plastics and pigments. Consumers would need to become plastics experts in their own homes. A typical user comment:

“Drop the idea now… We extruded some filament. NONE of it is usable... And if you think you have problems printing now – just add wonky self-made filament to the mix.”

Greater co-operation?

As an industry, additive manufacturing is still trapped in a dark age of trade secrets and low collaboration. As Spencer Wright says

“a part’s layer boundaries reveal its build orientation, and even with careful clean-up it’s generally possible to tell which surfaces have had support structures removed from them. In short, manufacturing forensics is, with enough experience and care, fairly reliable. And yet orientation and support structure setups are almost always treated as closely guarded secrets.”

No company can overcome all of the additive limitations, but progress may become faster with greater knowledge sharing, following the example of the open-source movements. Indeed, much of the progress on desktop machines was driven by open sharing of information by hobbyists worldwide and the efforts of public-spirited organisations like the RepRap Project, which has released all of its designs under the GNU General Public License. Perhaps the same will begin to happen with the larger, more industrial machines as well.

Where do we go from here?

Even if there is no great breakthrough in speed over the next few years, there will continue to be promising improvements. Meanwhile, consumers are clamouring for the advantages of additive products, so what can manufacturers do? They should take a measured approach - using the advantages additive technologies can offer, if appropriate to their business, while remaining realistic about the limitations and peculiarities of each process.

Sheer output has never been the strong point of additive manufacturing. It will never beat conventional techniques for a given object in mass production. When producing cans en masse, manufacturers will always benefit from a specifically-designed can-making machine. Additive manufacture is like a Swiss Army Knife – useful for many tasks, but no better at any one task than the specific tool designed for it. It gives the advantages of customisation and shorter lead times for individual parts, which makes for easy prototyping. But it can’t replace the power-tools of conventional mass production for larger scale manufacture.

Jack of All Trades, master of none Jack of all trades, master of none.

Consumers, designers and engineers must be educated out of the idea that additive manufacturing will solve everything. It is a powerful design and prototyping tool with some limited production applications – that is all. The question to bear in mind is: could the part be made more easily another way? Additive manufacture is seen as ‘sexy’ and is an accessible way to start making things as a beginner (especially with desktop machines). This is often an issue for start-ups and designers who have little experience in the workshop. They tend to focus on additive manufacture as a technology that they can understand, using it to produce prototypes and improve their designs. It is intuitive, safe and accessible for beginners familiar with CAD tools, and a desktop printer can find a place in a design house where a milling machine might not. However, when it comes to scaling up production, these designers may fail to progress to the next stage and look at other, more suitable production processes such as plastic moulding, die stamping or turning on a lathe.

“Thing is, I’m quite versatile in using bandsaws or mills or things like that. If I make it by hand, I make it properly. But if they can use them (3D printers) they will use them.” - James Beeby, interviewee.

‘MakeSpaces’ and 'HackerSpaces' (like our own one in Cambridge) can help to solve this problem by exposing designers to conventional workshop tools and making them aware of the limitations of additive manufacturing alone. Better education in schools and universities is also crucial, and must wherever possible allow students to get hands-on experience of other tools. Lastly, prototyping houses must do their part to educate the users of their services. A good example of this approach is Shapeways recently publicising Dominik Sippel’s research on the limitations of their EOS SLS technology.

'Cheating'

FoodJet's icing dispense technology Example of FoodJet's icing decoration capability

We don't actually need full 3D capability in order to take advantage of additive manufacturing technologies in production. ‘Cheating’ using only 2D+ printing is an entirely valid approach if it delivers what the consumer wants. Many successful ‘3D printing’ companies actually only provide a 2D+ late stage customisation service. For instance, Boomf exploits the high speed of full colour edible 2D printing onto ready-made conventionally produced marshmallows. FoodJet brings 2D+ printing to the food industry with computer controlled graphical decorating, cavity filling and shape depositing. The limit in height means its printers can be integrated onto full-speed production lines. Selective potting and sealing with resins in the electronics industry is another example of successful 2D+ printing.

Co-Design

Makie Dolls Makie dolls made through customiser app

Co-design allows customisation within achievable limits and hides inherent compromises from customers: a process which lets users make some design choices within a constrained space of options. This means people can customise a 3D printed object without having to worry about ending up with something that is unprintable. Companies in this space include MakieLab, whose popular, “free and fun app” allowed customers to design their own dolls “without you having to have any knowledge of 3D modelling or the manufacturing process”, the Music Drop customisable music box and Shapeways' Ring Customiser.

Using additive manufacturing to create just part of the form of the finished product is also a valid approach, one that takes the advantages of both worlds. For instance, Candy Mechanics has made ‘3D printed’ chocolate lollipops in the shapes of people's faces. These use additively manufactured mould templates produced on desktop FDM machines; vacuum forming then creates the actual mould. The potential to use additive moulds in injection moulding is very exciting – custom printed injection moulding tools for lower volume runs or custom production will bridge the gap between personalised batch production and mass production, reducing costs and time-to-market. This approach is already being used for medical device development, which requires many design iterations over small runs of mouldings.

Medical part mould created additively Medical mouldings already use this technique

A more realistic approach

Klaus Højbjerre at the AM department of the Danish Technological Institute observes

“While some people are obviously blinded by the hype, others are getting their hands dirty working with the technology right now. Not as the grand replacement of every existing manufacturing technology, but rather as complementary technology opening up new possibilities and markets.”

These areas, not the hyped idea of fully replacing all high-volume production with additive techniques, are where the most exciting and realisable advances will take place over the next few years. Over-enthusiasm about additive manufacture obscures these facts and leads to disillusionment and a lack of real progress. By keeping our feet on the ground and maintaining a clear understanding of what is suitable for 3D printing and what isn't, we can really get somewhere.

Happy printing!

Feb 13, 2017

Monkey Patching Reprise

It turns out that things aren't as simple as they seemed in my last post. A couple of people have pointed out to me that Python can indeed do monkey patching - so what's the difference between this and Ruby? Am I just making a fuss over nothing? First of all, to some proper definitions.

What Even Is Monkey Patching?

It's also known as guerilla (/gorilla) patching, hot-fixing, and more recently, 'duck-punching'.

Geoffrey: Now, you went to PyCon a couple months ago. And it’s well-known that in the Python world, they frown on monkey-patching. Do you think they would think more positively of duck-punching?

Adam: No, I think they will continue to look down on us, no matter how awesome and hilarious we become.

Voice In Background: Isn’t that the truth.

Geoffrey: I also have Patrick Ewing. Is this a good idea, and will it catch on?

Patrick Ewing: Well, I was just totally sold by Adam, the idea being that if it walks like a duck and talks like a duck, it’s a duck, right? So if this duck is not giving you the noise that you want, you’ve got to just punch that duck until it returns what you expect.

Transcript from Geoffrey Grosenbach podcast at RailsConf2007

Monkey patching can mean some subtly different things:

  • Changing a class's methods at runtime
  • Changing a class's methods at runtime and making all the instances of that class change after the fact

As pointed out by Dan Lenski in this thread on Stack Overflow, both variants are indeed possible in Python. Here's an example:

class Widget:
    def __init__(self):
        pass

    def who_am_i(self):
        print("I'm a widget")

>>> my_widget = Widget()
>>> my_widget
<Widget object at 0x7f6b5aa52e80>
>>> my_widget.who_am_i()
I'm a widget
>>> def teapot(self):
...     print("I'm a little teapot")
...
>>> Widget.who_am_i = teapot
>>> my_widget.who_am_i()
I'm a little teapot
>>> new_widget = Widget()
>>> new_widget.who_am_i()
I'm a little teapot

And Python doesn't warn you either! So perhaps I was unfairly harsh to Ruby? Not quite.

One Important Difference

Unlike in Ruby, we can't monkey patch the basic built-in classes, such as int, float or str. This is because they are defined in C extension modules, which are shared between multiple interpreters and made immutable for the sake of efficiency (and safety). So when you try to change the behaviour of a built-in, you can't -

def own_up(self):
    return len(self) * "!"

>>> my_string = "Hello, World!"
>>> my_string.upper()
'HELLO, WORLD!'
>>> str.upper = own_up
Traceback (most recent call last):
  File "<input>", line 1, in <module>
TypeError: can't set attributes of built-in/extension type 'str'

To do something which approximates the effect of monkey patching a built-in, we have to subclass instead - this example overrides the str.upper() method by inheriting from str.

class CustomString(str):
    def upper(self):
        desired_value = "!" * len(self)
        return CustomString(desired_value)

>>> custom = CustomString("hello world")
>>> custom
'hello world'
>>> custom.upper()
'!!!!!!!!!!!'

Ta-da! I still maintain that this is saner behaviour than Ruby.

(Sidenote - Remember, for str and any class based on it, Python strings are immutable - when we call methods on a string, we just get a new return value that we have to assign to some other string object to store.)
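To make the sidenote concrete, here's that immutability in action - calling a method never changes the string itself:

```python
s = "hello"
s.upper()        # returns a new string and throws it away...
print(s)         # ...so s itself is unchanged: hello
t = s.upper()    # keep the new string by binding it to a name
print(t)         # HELLO
```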

If Python and Ruby do it in similar ways, is Monkey Patching even All That Bad, then?

Yes! I still maintain that monkey patching is dangerous and shouldn't be used often. I think it says something positive about Python that it's not something you need to know intimately in order to program competently in the language, and indeed most Python programmers I know feel a bit iffy about it. It's a little off-putting (to say the least) to realise how much of a way of life it is for Ruby programmers.

But don't just take my word for it - have a read of this insightful blog post by Avdi Grimm, an experienced Rubyist worried about the impact of thoughtless 'hip' monkey patching - a choice quote:

Where I work, we are already seeing subtle, difficult-to-debug problems crop up as the result of monkey patching in plugins. Patches interact in unpredictable, combinatoric ways. And by their nature, bugs caused by monkey patches are more difficult to track down than those introduced by more traditional classes and methods. As just one example: on one project, it was a known caveat that we could not rely on class inheritable attributes as provided by ActiveSupport. No one knew why.

A fun (scary) read.

Temporary Monkey Patching in Python for testing

As I was looking around at this stuff, I discovered there are a couple of tools - unittest.mock.patch and pytest's monkeypatch fixture - designed to allow temporary monkey patching for testing: when the function or with-block using the patch exits, the patch disappears. I can see how useful it would be to set up mocks in this controlled way, rather than having to worry about them affecting all of your tests. It was deemed so handy that mock is now part of the standard library in Python 3, as unittest.mock. Time to investigate further!
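Here's a minimal sketch of the unittest.mock.patch.object flavour (the Greeter class is just an invented example): the patch only lives for the duration of the with-block, and the original method is restored automatically afterwards.

```python
from unittest import mock

class Greeter:
    def greet(self):
        return "hello"

g = Greeter()
# Inside the block, Greeter.greet is replaced by a mock...
with mock.patch.object(Greeter, "greet", return_value="goodbye"):
    print(g.greet())   # goodbye
# ...and the original method comes back automatically on exit.
print(g.greet())       # hello
```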

Feb 02, 2017

First Steps Exploring Ruby

Last night I went to the first of the Cambridge Programmers' Study Group new sessions - we're starting to work through the book 'Understanding Computation: From Simple Machines to Impossible Programs' by Tom Stuart. The book uses practical exercises in Ruby to explain a bunch of theoretical computer science stuff, working up to cellular automata, Turing machines and the lambda calculus (exciting!). As Stuart's blurb explains, the book is aimed at

An audience of working programmers without assuming any academic background.

So, people like me. It sounded like exactly the sort of thing I should be learning, since I don't have a Computer Science background.

One small problem - very few of us in the group actually knew any Ruby! So we spent this session trying to teach ourselves using the examples in the first chapter. It was fun trying to pick up a new language in the company of other people - a lot more companionable than noodling around on your own, and easier to find out cute things about the language because if one person discovered something cool, they shared it with the group. This post will be about some of my first impressions of Ruby.

Getting Ruby set up

For those like myself who prefer more instructions than 'download Ruby online!', here's how I did it in the end: $ sudo apt-get install ruby-full, which on my Ubuntu system installed Ruby 2.3 - just fine for my needs. If you use Windows or Mac I'm told the instructions on the Ruby-lang website are quite helpful and up-to-date. I first tried the default REPL, irb, which you call from the command line

user@device:~$ irb
irb(main):001:0> 1+2
=> 3

This works fine for the first few exercises, but I couldn't get auto-indentation to work, important for writing multi-line programs! I tried tinkering around with adding a .irbrc file with require 'irb/completion' as recommended in the comments of the main irb.rb file (which I found in /usr/lib/ruby/2.3.0). This didn't help - perhaps I put the config file in the wrong place? I ended up just using a slightly different terminal, pry, which was recommended by a colleague at the Meetup. It does auto-indentation and colour-coding too, which was very helpful! I don't think I'll be going back to irb. To install that, I just did $ gem install pry (gem is Ruby's package manager). To start using it, just type pry in the command line.

Ruby is like Python

Ruby is similar to Python in a lot of ways - this almost hindered me as I kept getting confused between Python and Ruby syntax. It felt like trying to learn Spanish when I already knew some French - confusingly close but distinct. I think ultimately this is helpful (I was mentally translating some things to Python to understand them, and that worked quite well) but to start with it was rather confusing.

Some similarities: yield sort of works the same

>> def do_thrice
*   yield
*   yield
*   yield
* end
=> :do_thrice
>> do_thrice { puts "Hello world!" }
Hello world!
Hello world!
Hello world!
=> nil

And the string formatting works rather like Python 3.6's new f-strings. So for instance, in Ruby "#{x*2} and also #{y+1}" gives you a string that calculates & inserts those values.
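For comparison, here's the Python f-string version of that Ruby snippet, with x and y bound to some example values:

```python
# Python 3.6+ f-strings: expressions inside {} are evaluated
# and inserted, just like Ruby's "#{...}" interpolation.
x, y = 5, 9
print(f"{x*2} and also {y+1}")  # 10 and also 10
```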

The * operator works in similar ways to unpack elements of an iterable - for instance, you can do fun things like:

>> a, *b, c = [1, 2, 3, 4]
=> [1, 2, 3, 4]
>> b
=> [2, 3]
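As it happens, the exact same extended unpacking works verbatim in Python 3:

```python
# Python 3 extended iterable unpacking mirrors Ruby's splat here:
a, *b, c = [1, 2, 3, 4]
print(b)  # [2, 3]
```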

Ruby is not like Python

Objects! Objects everywhere!

(Almost) everything is an object in Ruby, and it made me realise how I usually write Python in a fairly un-object-oriented style. I generally avoid writing classes if I can help it (especially classes which only contain two methods, one of which is __init__), but avoiding them isn't really possible in Ruby. I suppose it's good for my soul - having a language that forces you into object-oriented thinking must make you confront what that really means. That's why there are these little => nil statements scattered around my code examples, by the way - irb and pry always show you what object your statement evaluates to, and often that's nil, which is the object representing nothing - the 'empty' object, if you're mathematically minded.

I guess I'm going to have to go read those books on Design Patterns after all...

It's super-terse

If Python cares a lot about saving programmers from typing unnecessary characters (goodbye ; I didn't miss you) then Ruby cares even more. There are a lot of things you can do in one line. Here's a good one:

>> (1..10).select { |num| num.even? }
=> [2, 4, 6, 8, 10]

This creates a range which spans the numbers 1 to 10, then looks at all the numbers in that range and, if they are even, adds them to a list and returns the list. Phew!
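The closest Python equivalent I can think of is a list comprehension over a range (noting that Python's range(1, 11) excludes the endpoint, unlike Ruby's 1..10):

```python
# Keep only the even numbers from 1 to 10, in one line:
evens = [num for num in range(1, 11) if num % 2 == 0]
print(evens)  # [2, 4, 6, 8, 10]
```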

At times, this can be confusing - for instance, there's no need to write explicit return statements if you don't want to. By default, Ruby returns the evaluation of the last statement in the body of a method. For instance,

>> def what_do_i_return
*   "A"
*   "B"
*   "or C?"
* end  
=> :what_do_i_return
>> what_do_i_return
=> "or C?"

You can write explicit return statements, but you don't have to. And the way to get a function to evaluate to sort-of-nothing is to explicitly return nil (of course!). This returning-by-default is confusing at first, but it seems to fit with the rest of Ruby's style.
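Python takes the opposite approach: a function body without an explicit return always evaluates to None, whatever its last expression was:

```python
# The mirror image of the Ruby example above - the bare string
# expressions are evaluated and discarded, and the call returns None.
def what_do_i_return():
    "A"
    "B"
    "or C?"

print(what_do_i_return())  # None
```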

'nil' isn't really nothing

You can put nil into things, which is hella confusing. For instance, suppose you have an array, you can add nil elements to it:

>> my_array = ["a", "b", "c"]
=> ["a", "b", "c"]
>> my_array[4]
=> nil
>> my_array.push("d")
=> ["a", "b", "c", "d"]
>> my_array.push(nil)
=> ["a", "b", "c", "d", nil]
>> my_array[10] = "surprising behaviour"
=> "surprising behaviour"
>> my_array
=> ["a", "b", "c", "d", nil, nil, nil, nil, nil, nil, "surprising behaviour"]

Assigning to an element beyond the length of the array inserts nils until it reaches that element index (which didn't exist before!) and then it bungs in the thing you assigned. Intriguing!
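Python lists, for comparison, flatly refuse to do this:

```python
# Assigning past the end of a Python list raises IndexError
# instead of nil-padding the gap.
my_list = ["a", "b", "c"]
try:
    my_list[10] = "surprising behaviour"
except IndexError as err:
    print(err)  # list assignment index out of range
```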

Some gripes

'Monkey patching'

'Monkey patching' is Ruby programmers' name for the ability to dynamically modify classes 'live' - at any time - and it's dangerous. You can even do this to Ruby's built-in classes, like String. At first glance it sounds really cool - you can just define a new method on a class whenever you like! For instance

>> class String
*    def shouty
*        upcase + "!!!"
*    end
* end

adds a new 'shouty' method to strings. So now any string can use .shouty, just as if it were a built-in method

>> "hello".shouty
=> "HELLO!!!"

But here's the thing - monkey patching can break things really easily - and Ruby won't warn you about it! One of the guys sitting next to me at the meet-up managed to break his irb's Array class by redefining some of its built-in methods. Especially if you're new to Ruby and don't know all the built-in class names, it's really easy to accidentally monkey patch your way into disaster. It seems like a really powerful feature, but I'm not happy with how it seems to work. Perhaps with a nice Ruby IDE that reminds you about all the classes you have on-the-go, it's easier to avoid horrible mistakes?

CONSTANTS - why even bother?

'Constants' are sort-of a thing in Ruby. If you define a variable whose name begins with a capital letter, Ruby treats it as a constant - but if you reassign it, the interpreter only gives you a warning. It won't do anything else about it: it will still change the 'constant' value, and just complain. Here's an example:

>> CONST = "Don't change me please"
=> "Don't change me please"
>> CONST = "I changed you anyways!!"
(pry):224: warning: already initialized constant CONST
(pry):223: warning: previous definition of CONST was here
=> "I changed you anyways!!"

What's the point of having this feature? Can we at least write larger programs that catch this warning and refuse the change (or rather change it back, since the reassignment has already happened by the time we're warned)? Apparently not! I'm happy to be enlightened by some Ruby god, but as far as I can tell this feature does nothing except confuse people used to programming in languages where constants exist and are, well, constant.
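For what it's worth, Python doesn't even go that far: ALL_CAPS names are purely a naming convention, and rebinding one is completely silent:

```python
CONST = "Don't change me please"
CONST = "I changed you anyways!!"  # rebinding: no warning at all
print(CONST)  # I changed you anyways!!
```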

Overall Impressions

I found that during the session I fluctuated between delight ("what a nice way to do that!") and terror/annoyance as things behaved in unexpected ways. It has some peculiarities that particularly stand out to me in comparison with Python, and some features that are just adorably cute. I've been noodling around with it on and off all day today, and still enjoying getting to know it. Overall I'd say it's been fun! I can't say that I'm dying to use it as my new scripting language, but I'll certainly be pleased to become better acquainted as we work our way through the Computation book.

PS

Thanks again to RedGate for hosting the session and feeding pizza to 20+ hungry programmers!

Jan 25, 2017

Be a Coding Professional

I was looking back through my mindmaps and came across this one, which I made when I first read "The Clean Coder" by 'Uncle' Bob Martin. I remember the book fondly and thought it might be fun to do a proper review.

My mindmap of The Clean Coder

Overall Impressions

The Clean Coder is a great read - Martin is a legend for a reason and his writing style is compelling, fun and readable. As the subtitle of my mindmap says, it's great! The book surprised me because it's not really about the mechanics of writing good code - it's all about professionalism (the 'clean' part) and having a set of ethics as someone who works in software. Martin uses some funny and horrifying war stories from his experiences as a novice programmer-for-hire to flesh out the book and explain what can happen when you don't have robust professionalism in your practice. I found myself nodding along as he described the team dynamics behind giving bad time estimates for completing a project, and how one of the most professional things you can do is sometimes to say "no" to tasks.

Yoda and Robert Martin have a lot in common There is no "try".

He also has some very insightful things to say about the dark side of 'Agile' practices, and how driving too hard toward a goal just-this-once is never a good idea. If nothing else, he points out, it doesn't even get you what you wanted! The code produced in those sort of conditions is invariably horribly flawed "3am code" and/or turns out to solve the wrong problem because those creating it were too busy trying to push something out the door to pay attention to the changing needs of their client. I'm reminded of this blog post from On Food and Coding:

With a given division of labour and a given level of skill, the proprietor can keep more money for themselves if they can get their staff to work for longer each day or to work with greater intensity. And when we deconstruct the terminology used by the most popular Agile process, Scrum, we find an unfortunate allusion to work at a greater intensity. For example, the week-by-week cycle of development is called an "iteration" in practically every other software process, including Extreme Programming. In Scrum this time period is called a Sprint. What associations does that conjure up? Well when you sprint you make an extreme effort to go as fast as possible, an effort that you can only keep up for a short time. That's what "sprint" means in everyday life.

Martin viscerally describes the sort of race-to-the-bottom conditions that this, coupled with optimistic overestimation of one's own ability to finish tasks, can induce.

Some reservations

One thing I disagree with is the amount of time he suggests allotting to your own personal practice - pairing with new people, doing Code Katas, listening to podcasts on software, picking up a new language, generally honing your craft. I absolutely think it's important to do some of this outside of work - one of the reasons I started this blog is to motivate myself to learn more, by giving me a place to write about what I'm learning and some accountability. But the time he suggests blocking out (20 hours!) is just unrealistic for most people, and I think such a high bar is more likely to discourage would-be software folks from jumping into the field. Understanding more about Martin's background explains where this comes from: he has a typically American 'Sleep is for the Weak!' attitude, only partly tempered by his realisation that if you work for too long you stop producing things of worth, because you're unable to concentrate well enough. Martin doesn't quite take his observations on 'Focus manna' far enough to encompass leisure-time programming as well. Following his own arguments, I have limited time in the week to practise my craft before I'm too tired for it to be of use - and I'm sure that line falls in different places for different people.

Jan 23, 2017

Stop Writing (Unnecessary) Classes

I listened to a great talk recently called Stop Writing Classes by Jack Diederich. The talk goes through some of the unhelpful ways classes can be used in Python and makes good points about readability and simplicity. Here's my mindmap of the talk. My mindmap of the talk

Game of Life

Diederich uses as an example a neat implementation of Conway's Game of Life which is beautifully straightforward and avoids the tempting trap of making a class for a Cell, a class for a Board, etc. His example is so small I've copied and annotated it here (renaming a few of the variables for extra clarity):

""" Example Game of Life from 'Stop Writing Classes' talk by Jack Diedrich. """
import itertools


def neighbours(point):
    x, y = point
    yield x+1, y-1
    yield x+1, y
    yield x+1, y+1
    yield x, y-1
    yield x, y+1
    yield x-1, y-1
    yield x-1, y
    yield x-1, y+1


def advance(board):
    new_state = set()  # start with an empty set of live cells
    friends = set(itertools.chain(*map(neighbours, board)))
    cells_we_care_about = board | friends

    for point in cells_we_care_about:
        count = sum((cell in board)
                    for cell in neighbours(point))
        if count == 3 or (count == 2 and point in board):
            new_state.add(point)

    return new_state


glider = set([(0,0), (1,0), (2,0), (0,1), (1,2)])

for i in range(1000):
    glider = advance(glider)

print(glider)

What friends does requires some unpacking. First of all, *map(neighbours, board) takes the iterable returned from map(neighbours, board) and unpacks it: the * splits the single thing returned from map into its individual elements and feeds them to the chain() function as separate arguments. Here's an example of * in action:

def hallo(arg1, arg2):
     print(arg1)
     print(arg2)
     print("Hallo!!")

>>> printme = ("Yes", "Indeed")
>>> hallo(printme)
Traceback (most recent call last):
  File "/usr/lib/python3.5/code.py", line 91, in runcode
    exec(code, self.locals)
  File "<input>", line 1, in <module>
TypeError: hallo() missing 1 required positional argument: 'arg2'
>>> hallo(*printme)
Yes
Indeed
Hallo!!

This map() iterable is the function neighbours() applied to every point in board - so map(neighbours, board) yields, for each live cell, a group of that cell's neighbours. These groups are the arguments to chain(), which glues iterables together into one long chain object. For example:

>>> import itertools
>>> love = itertools.chain("123", "DEF")
>>> love
<itertools.chain object at 0x7fe58ada5810>
>>> for item in love:
...     print(item)
...
1
2
3
D
E
F
>>> # a chain is a one-shot iterator, so build a fresh one for the set
>>> sling_it_into_a_set = set(itertools.chain("123", "DEF"))
>>> sorted(sling_it_into_a_set)
['1', '2', '3', 'D', 'E', 'F']
>>>

So it bungs all the groups of neighbours into one single set. Because of how Python sets work, this gets rid of any duplicate points - e.g. a point which neighbours a couple of living cells is still only mentioned once.

| is Python's set union operator (the same symbol as the bitwise OR operator) - so cells_we_care_about is a set which contains every point that is in the set of live cells OR the set of their neighbours

count counts the live neighbours of each cell under consideration. Diederich then applies the rules of Conway's Game of Life in one line:

  • A dead cell that has exactly 3 live neighbours becomes alive
  • A live cell that has 2 or 3 live neighbours remains alive
  • Every other cell dies

Sets are awesome

One of the reasons his implementation is so tidy and has so few lines is his great use of set(), one of the built-in Python types. Sets are particularly great for this problem because they contain only unique elements - if you try to add multiples of the same element to a set you only end up with one in there. This means any duplicate points are eliminated without having to think about it.
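A tiny demonstration of the two set properties the implementation leans on - duplicates collapsing, and |/|= computing unions:

```python
# Two overlapping groups of neighbour points:
neighbour_groups = [{(0, 0), (1, 0)}, {(1, 0), (2, 2)}]
merged = set()
for group in neighbour_groups:
    merged |= group          # the duplicate (1, 0) collapses silently
print(sorted(merged))        # [(0, 0), (1, 0), (2, 2)]
```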

For more examples of how to get good use out of Python's built-ins I recommend Built in Super Heroes by the excellent Dave Beazley.

Jan 17, 2017

Moving to Pelican

I decided to make a personal website a while back, to play around with web stuff for the first time. Taking the path of least resistance I used Github Pages - they give you a free 'Personal' static website which is rather nice. I cobbled together some CSS and HTML and made something that looked as if it had emerged from the 90s. It was perfectly fine for what I wanted to do, which was host little JavaScript games I was playing with writing - I had never written JavaScript before and wanted to give it a try.

Now I'd like to make this website into a blogging platform, and make it look a bit better, too. Rather than hand-coding everything from scratch (and still having something that looks as if it emerged from the 90s), I wanted to start out with something I can modify for myself with only a bit of effort.

I explored a bewildering range of static website generators, trying to work out what I should use. According to this list there are over 100 of them out in the wild now! Even narrowing myself down to only the Python flavours (on the theory that I want to be able to tinker deeply with the webpages I create) left me with far too many to choose from. So I asked around friends, and on the advice of Hannah McLaughlin I went for Pelican.

Getting Pelican up and running was not as easy as the docs would like you to believe (ah, if only...). So as nearly all first-time Pelican users must do, if the examples in the Pelican Themes page are anything to go by, my first blog post here will be a tutorial - mostly for my own reference - on how to use Pelican. I found that Amy Hanlon's post on the subject was most helpful, as she doesn't tend to assume any 'obvious' prior knowledge from the reader.

First Steps

In the terminal, set up a new virtual environment for all your Pelican business (good practice). Then make a directory for the website and initialise Pelican in there. These steps went without incident.

mkvirtualenv pelican
setvirtualenvproject
workon pelican

pip install pelican
pip install Markdown
mkdir Website
cd Website
pelican-quickstart

Pelican's quickstart does allow you to get off the ground pretty fast, setting things up the way you'd like after asking a series of questions. There was an option to say I wanted to upload my site using GitHub Pages - perfect. Say 'yes' to auto-reload, to the simple HTTP script (it helps with developing your theme) and to creating a Fabfile/Makefile. It spits out some basics:

content            Makefile        pelicanconf.pyc
develop_server.sh  output          publishconf.py
fabfile.py         pelicanconf.py  

pelicanconf.py is where your local configuration sits - that's what you want to alter when you're tweaking the webpages you display. publishconf.py is the file that deals with published web pages, eg analytics etc - we're not going to mess with that. content is the folder where you bung all your Markdown (.md) or reStructuredText (.rst) files for all those great blog posts. Pelican will also recognise the folder names images and pages automatically - these are where your pictures and the different webpages you create (aside from the blog posts) will go. output is the folder where the final CSS and HTML live, and it's the part that should be posted to wherever you're doing your web hosting.

Problem 1 - no Themes

I try to host my new website locally to see what it looks like - and it looks like crap! That's because I don't have any of the lovely themes loaded. I try exactly what the Pelican Themes page recommends: git clone the themes into /home/user/pelican-themes and add a line to the pelicanconf.py file, like so:

THEME = "/home/user/pelican-themes/blue-penguin"

Sanity check: do I have any themes installed?

(pelican)sandmanuser@sandman-VirtualBox:~/Website$ pelican-themes -l
notmyidea
simple

No. How strange. I did clone pelican-themes locally:

git clone https://github.com/getpelican/pelican-themes

But since I didn't clone with --recursive, the actual files for each theme (which live in git submodules) weren't copied - when I go in and look, it's just a bunch of empty folders. Re-cloning with git clone --recursive takes forever, because this time it really is fetching every theme. Now we're cooking on gas: when I run make html and then make serve I see that it's looking pretty. Ctrl+C stops the server.

Problem 2 - not updating local server

I discover what must be another common problem - I go change a webpage to try something else out, hit make serve and find that it won't run:

socket.error: [Errno 98] Address already in use
Makefile:77: recipe for target 'serve' failed
make: *** [serve] Error 1

Woops! Looks like I didn't actually switch off the server I made to try things out - it was still running in the background. It turns out the nicest way of dealing with this is to use the shell script they've kindly made for you - just run ./develop_server.sh start and ./develop_server.sh stop to start the development server and stop it again nicely. This also updates your webpages locally live - as soon as you make and save a change to them, the development server updates your webpage so you can see how things look in real time.

Why didn't you tell the world?

So far this website is only working on a little server-in-a-bottle we've created in our own computer. Time to take it to the web! Since I'm hosting my site with GitHub Pages, my user site on GitHub is where I'm going to push the contents of my output folder to.

I'm going to create a separate repository for the rest so I can back up my Markdown files, tweak my Theme and rebuild the website from different computers.

This requires a little careful thought when it comes to .gitignore and how we refresh the output folder.

And once you've gotten everything working...

You might want to do a freeze to capture the packages you installed & make it easier on yourself moving to another computer.

pip freeze > requirements.txt