The evolution of forensic technology over the past decade can be described as exponential. In fact, in recent years new breakthroughs are being made every single day.
Only a few years back, forensic scientists were unable to accurately test samples and match them to the right suspect. Many elements of forensics were underdeveloped, which hampered the conviction process.
So where are we now? Where were we then? And how did it all start?
DNA Fingerprinting

Genetic fingerprinting can be traced back to Alec John Jeffreys, a British geneticist who, in 1984, discovered that all humans have variations in their genetic code.
This discovery enabled investigators to match DNA left at a crime scene to a single culprit – the first case of this being that of Colin Pitchfork.
Pitchfork, a double murderer and rapist, was the first person to be convicted using DNA forensic technology, in 1988.
The semen left at the crime scene was manually compared against samples from 5,000 local male suspects before being matched to Pitchfork.
This manual comparison is practically unheard of in this day and age, due in large part to the founding of the National DNA Database in 1995 which, by 2019, held upward of 6 million profiles – but DNA fingerprinting is still evolving.
More than 20 million DNA kits have been sold worldwide through various companies.
However, most don’t know these companies have the right to share your genetic information with third parties – much to the advantage of investigators.
In 2018, police were able to trace the family tree of ‘The Golden State Killer’, Joseph James DeAngelo, a murderer and rapist who had eluded authorities since his crime spree ended in March of 1990.
Fingerprint Identification

Although the history of fingerprint identification is a long and interesting one – the first officially recorded case coming from China’s Qin Dynasty in the 200s BC – its prominence in 20th-century forensic identification can arguably be attributed to two men: Eduardo Alvarez and Francis Galton.
In 1892, Argentinian inspector Eduardo Alvarez was able to match a bloody handprint to Francisca Rojas, the mother of two recently murdered sons, who had slit her own throat in an attempt to avoid suspicion.
In the very same year, Sir Francis Galton, an English Victorian statistician, published his text Finger Prints, in which he conclusively proved what many had suspected for centuries – that fingerprints do not change over time and that each individual print is unique.
Galton’s classification of fingerprints (the loop, whorl, and arch) is still in use today.
In the 1980s, the Automated Fingerprint Identification System (AFIS) first took shape and, by the late 1990s, the Integrated Automated Fingerprint Identification System (IAFIS) held upward of 31 million civilian prints available for criminal comparison.
However, within the past decade, fingerprint forensic technology has evolved exponentially – take, for instance, the WhatsApp fingerprint identification case.
A WhatsApp photograph partially showing three fingers holding an ecstasy tablet was reconstructed using pioneering techniques by South Wales Police, leading to the arrest of Elliott Morris.
He, along with his parents, was sentenced for conspiracy to supply, with Morris receiving eight and a half years. Forensic scientist Dave Thomas had this to say about Morris’ arrest:
“It is an old-fashioned technique, not new. These guys are using the technology not to get caught and we need to keep up with advancements.
While the scale and quality of the photograph proved a challenge, the small bits were enough to prove he was the dealer.”
Blood Spatter Analysis
The earliest surviving study of blood spatter was conducted by Dr. Eduard Piotrowski in his 1895 paper On the formation, form, direction, and spreading of blood stains after blunt trauma to the head.
In this paper, Piotrowski details his experiments bludgeoning rabbits’ skulls against white sheets and studying the resulting patterns.
However, in recent years the study of bloodstain analysis has gone from strength to strength thanks to continued research – take the work of Professor Alexander Yarin as an example.
In 2016, Yarin began researching the back spatter of blood resulting from gunshot wounds, leading to the development of a model with which bloodstain patterns can be both predicted and interpreted.
The next step in this advancement, according to Yarin, is research into forward blood spatter (blood which travels in the same direction as the bullet). So what does this all mean? Well, as described by Patrick Comiskey, an associate working under Yarin:
“Mathematically, backward and forward spatter are very different. What we’ve been able to do is predict the location – with the mathematical model we created.
The intent is to better analyze crime scenes and to look at a more accurate way of recreating [them].”
Ballistic Testing

In a nutshell, the forensic technology of ballistic testing can be defined as the study of the unique markings left on a fired bullet.
Ballistic analysis can be split into three differing camps: internal, the study of what happens inside the fired weapon; external, the flight of the bullet; and terminal, the behaviour of the bullet once the target has been struck.
The first successful forensic firearm examination was conducted in 1835 by Henry Goddard of the Bow Street Runners.
However, the most notable case of ballistic fingerprinting occurred in 1929 with the now-infamous Saint Valentine’s Day Massacre, in which seven men were executed using two machine guns conclusively proven to belong to members of Al Capone’s gang.
So how are ballistics tested? Well, the same tool used to incriminate Capone’s thugs is still very much prevalent all these decades later.
A comparison microscope is a device which allows two contrasting pictures to be analysed alongside one another in hopes of detecting a matching pattern. However, in recent years modern techniques have been applied to this long-standing practice.
Approximately a decade ago, ballistics experts began developing 3D firearm testing, which aims to replace the soon-to-be-outdated 2D technique.
Through 3D testing, data is gathered and compared against a digital database of firearms (as opposed to manually observing the patterns) – meaning technology, rather than the human eye, is relied upon, leaving far less room for error or doubt.
Toxicology

The birth of toxicology can be traced back to Paracelsus, a sixteenth-century physician, who famously stated:
“All substances are poisons; there is none which is not a poison. The right dose differentiates a poison and a remedy.”
However, it is Mathieu Orfila, a Spanish chemist and toxicologist, who is commonly regarded as the technique’s modern father, owing to the publication of his 1815 text Traité des Poisons.
The first case of toxicology playing a key role in a criminal conviction happened just over 30 years later, when Jean Stas discovered traces of nicotine poisoning in human tissue – proving beyond doubt that the victim’s brother-in-law, Hippolyte Visart de Bocarmé, had poisoned him with the motive of financial gain.
In the modern world, toxicology reporting traditionally goes hand in hand with the autopsy report, during the coroner’s examination.
Samples of bodily fluids and tissues are collected (including hair, blood, urine, and so on) and tested for drugs or poisons present in the body which may have led to the death, rape, or assault of the victim.
However, over the past few decades, there have been major advances in the field – perhaps the most interesting development being the introduction of in vitro toxicology.
In vitro toxicology, broadly summed up, consists of the analysis of cells or tissue grown in a laboratory setting and examined as differing toxic substances are introduced.
Why is this advancement in the field so important? One must simply remember the three R’s: reduce – the number of animals subjected to toxicity testing will drastically decrease; refine – there are fewer biological variations associated with the in vitro technique; and replace – the aim of in vitro testing is to make the prior form of toxicity testing, in vivo, obsolete.
3D Facial Reconstruction
Perhaps somewhat surprisingly, the first-ever successful 3D facial reconstruction from a skull occurred in 1895, when Wilhelm His produced his approximation of the composer Johann Sebastian Bach’s face.
However, the introduction of facial reconstruction into the world of forensics was in large part due to one man, William M. Krogman. Krogman, or the “Bone Detective” as he is commonly known, was a leading anthropologist.
He revolutionised the forensic technology of facial reconstruction with the publication of his 1962 book The Human Skeleton in Forensic Medicine, which contains definitive research still used in investigations today.
So what is 3-dimensional facial reconstruction?
Well, in layman’s terms, this technique is an amalgamation of artwork and science. Forensic facial reconstruction takes a skull, usually of indeterminate origin, and artificially reconstructs the face with specific reference to skull shape and facial tissue depth which, in turn, gives a clear indication of the suspect’s or victim’s identity.
A perfect example of this practice is the reconstruction of famous faces throughout history – the most familiar being the face of King Tut.
Within the last decade, in particular, the popularity of forensic facial reconstruction has reached a fever pitch. And with renewed interest in this technique, many advances have come to the forefront.
Over the past decade or so, computer technology has advanced exponentially – meaning that 3D facial reconstruction has evolved along with it.
Computerised facial reconstruction counteracts many criticisms this technique has previously faced: the time consumed is decreased dramatically, the uncertainty of artistic interpretation is reduced, numerous variations of the face from differing angles can be produced with ease and, of course, less money and fewer supplies are involved.
Digital Forensics and Monitoring
Perhaps the most obvious example of evolving forensic technology within the past decade is the advancement in digital and monitoring technology.
Everywhere we look there is seemingly a camera watching us. Street surveillance is nothing new, beginning its life in the 1970s and growing, by 2013, to an estimated 6 million street cameras in the UK.
In the six years since, that number has only grown larger – and that’s not including car, body, and home surveillance equipment.
So why is CCTV such an important tool for investigators? Well, it is both a deterrent and an identifier. One is far less likely to commit a crime if there is clear evidence placing them at the scene.
However, the presence of CCTV is not always an effective deterrent and, unfortunately, many grisly crimes have been captured on camera.
In April 2013, footage of the Boston Marathon Bombing shocked the world. The footage, captured by CCTV cameras, showed the Tsarnaev brothers as they carried and planted the bombs that killed three people and injured more than 200.
However, CCTV cameras are not the only things monitoring us (and those who wish us harm) – so are our phones and computers.
Our home technology can be traced and located, and the information it holds recovered and retrieved – and so much more. Within the past decade, home technology systems have taken a substantial leap forward and, in the decades to come, one cannot even begin to predict the bright future of digital forensics and monitoring.
Hi there! I’m Heather, a recent graduate with a Master of Arts with Honours in English and Film Studies. Along with my degree, I have acquired an HNC in the Social Sciences. In my free time I love to cuddle with my dog and immerse myself in all things True Crime, and I am the new Crime Writer here at iTHINK Magazine.
Image credit: History Scotland magazine