Tsunami deposits are sedimentary units laid down by a tsunami. Such deposits can be left offshore during the backwash phase or onshore during the inundation phase.
Tsunami deposits help in identifying past tsunami events and in constraining estimates of tsunami and earthquake hazards. However, distinguishing deposits left by tsunamis from those left by storms and other sedimentary processes has proved a great challenge (Bernard & Robinson, 2009). It is worth noting that well-documented historical tsunami deposits can be compared with those of storm events. Onshore, in both cases, the overwash deposits are found along low-lying areas of the coastline, and depositional environments such as still-water swamps and lacustrine settings yield fine-grained sediments. Still, the presence of material eroded from the shelf is in most cases taken to indicate a tsunami rather than a storm, since tsunamis carry greater erosive power and energy; the movement of large boulders is likewise associated with a tsunami origin. Sediments that are not deposited onshore accumulate offshore; these are the sediments that settle in shallow water or become involved in debris flows (Karan & Subbiah, 2011).
Tsunamis that have been observed, measured, and documented, whether in writing or orally, are considered historic. Historic tsunamis include those that struck eastern Scotland, northern France, southwestern Thailand, southwestern Spain, the southwest coast of India, and the east coast of Japan, among others. This paper attempts to explain the nature of the tsunamis in the above-mentioned areas. The tsunami deposits found there will be highlighted, with a description of how each event unfolded. In addition, a literature review of the various tsunamis will be outlined to establish their history. The methodology section will identify the sources of information used in the paper, the recorded tsunami histories will be analyzed, and the results will be presented. Finally, a conclusion and recommendations will be given.
Tsunamis are series of waves with wavelengths of up to 100 km and periods ranging from 10 minutes to 1 hour that can travel at up to 1,000 km per hour in the open ocean (Bryant, 2008). They are caused by enormous disturbances that displace large volumes of water, most commonly displacement of the sea floor during earthquakes. Events such as volcanic eruptions, oceanic bolide impacts, and submarine landslides can also generate tsunamis. In the open ocean a tsunami may be only one or two meters high, but as it approaches shallow water it slows down and begins shoaling, which produces an enormous increase in wave height. The damage a tsunami causes results from inundation, sediment erosion, wave impact, and deposition; the larger the tsunami, the greater the impact. It is worth noting that the inundation and run-up height of a tsunami vary greatly over short distances because of complex interactions between the waves and the land surface.
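The open-ocean speeds and the shoaling amplification described above follow from the standard shallow-water wave approximation, in which wave speed is c = sqrt(g·h) and, by Green's law, wave amplitude grows as depth^(-1/4) as the water shallows. The following minimal Python sketch illustrates both relations; the depth values are illustrative and are not taken from any particular event.

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def wave_speed_kmh(depth_m: float) -> float:
    """Shallow-water wave speed c = sqrt(g * h), converted from m/s to km/h."""
    return math.sqrt(G * depth_m) * 3.6

def shoaled_amplitude(a0_m: float, h0_m: float, h1_m: float) -> float:
    """Green's law: amplitude grows as depth^(-1/4) when depth drops from h0 to h1."""
    return a0_m * (h0_m / h1_m) ** 0.25

# Illustrative depths: abyssal ocean (4,000 m) versus a nearshore shelf (10 m).
print(f"open-ocean speed: {wave_speed_kmh(4000):.0f} km/h")   # ~713 km/h
print(f"nearshore speed:  {wave_speed_kmh(10):.0f} km/h")     # ~36 km/h
print(f"a 1 m wave shoals to ~{shoaled_amplitude(1.0, 4000, 10):.1f} m")  # ~4.5 m
```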
It is important to note that fine and medium sand is restricted to short distances inland, whereas finer sediments remain in suspension and are carried farther. Although local topography affects tsunami deposit thickness, the deposits generally thin landward. In areas of low-relief topography the landward thinning rate is exponential, reflecting the dominance of sediment supply in nearshore areas compared with areas farther inland. Most tsunami deposits consist of sand and mud layers and are classified into two groups: mud-dominated and sand-dominated deposits. Sand-dominated deposits are mainly distributed within the first 2,300 meters from the coast, whereas mud-dominated ones generally occur farther landward (Putra, Nishimura, Nakamura, & Yulianto, 2013).
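The exponential landward thinning mentioned above can be written as T(x) = T0 · exp(-k·x), where T0 is the deposit thickness at the shoreline and k a decay constant. The sketch below simply evaluates this relation; T0 and k are hypothetical illustrative values rather than parameters fitted to any real deposit, and the 2,300 m sand-dominated limit from the text is used only to label the output.

```python
import math

def deposit_thickness_cm(x_m: float, t0_cm: float = 25.0, k_per_m: float = 1.5e-3) -> float:
    """Exponential landward thinning: T(x) = T0 * exp(-k * x).

    t0_cm (thickness at the shoreline) and k_per_m (decay constant) are
    hypothetical illustrative values, not fitted to any actual deposit.
    """
    return t0_cm * math.exp(-k_per_m * x_m)

for x in (0, 500, 1000, 2300, 3000):
    facies = "sand-dominated" if x <= 2300 else "mud-dominated"
    print(f"{x:>5} m inland: {deposit_thickness_cm(x):6.2f} cm ({facies})")
```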
Before the 1990s, the public believed that tsunamis originated far away under water, mainly in the Pacific Ocean (Satake, Okal, & Borrero, 2007). In those days, the fear of tsunamis was addressed through early warnings aimed at preventing loss of property and lives. During the 1990s, 14 major tsunamis struck the world's coastlines, causing numerous deaths, loss of property, and major economic losses. These events made scientists aware that tsunamis are pervasive in nature. The tsunamis in question were generated by either submarine landslides or earthquakes, and few warnings reached the local populations, hence the enormous loss of property and lives. Against this background came the great shock of 26 December 2004, when one of the most powerful earthquakes ever recorded struck off the coast of northern Indonesia. The tsunami it generated swept across the northern Indian Ocean, causing numerous deaths (Cuven et al., 2013).
It is worth noting that a big tsunami can strike without warning even in a world where contemporary technology is expected to save everyone. There is also evidence of mega-tsunamis ten times larger than the Indonesian one, recorded along the heavily protected coastline of eastern Australia; one of these had the largest run-up height documented anywhere in the world in the past 5,000 years. Such events have been repetitive, although people may have no clear memory of them, since no historical records were kept. Aboriginal legends, however, did record such occurrences orally, and ongoing research suggests that these mega-tsunamis were widespread, with their effects most evident in eastern Scotland, along the Australian coast, and in New Zealand. The origin of such mega-tsunamis has been contentious, but they were most likely caused by submarine landslides or by the impact of asteroids with the world's oceans. These discoveries have vital implications, bearing in mind that Western civilization is unique in its development of coastal cities and settlement of shorelines (Babu, Suresh Babu, & Mohan Das, 2006).
Research can draw on primary or secondary sources of data. Primary data are obtained first hand from the field, through observation, questionnaires, or interviews. With observation, the researcher visits the place in question, observes the area or events as they happen, and records the information in writing. A questionnaire is a set of written questions to be filled in by a respondent; it can be posted to the relevant areas or handed to respondents directly. Interviews can be either direct or indirect. A direct interview is a face-to-face dialogue in which the interviewer asks questions and records the interviewee's responses. An indirect interview is conducted at a distance: the interviewer sends a set of questions to an interviewee, who completes and returns them for review. Secondary sources involve data that have already been collected and kept in records such as books, journals, electronic databases, magazines, and articles. In this paper, secondary sources, namely books, journals, and electronic databases, are used.
The eastern coastal part of Scotland holds the most extensive and detailed evidence of sizable sea-level change in Britain. Among the most prominent features are estuarine sediments of Holocene age, the carselands, which locally preserve evidence of up to six shorelines. The highest of these is called the Main Postglacial Shoreline; it occurs at the landward limit of the carselands and across most of the flat surfaces associated with them. Radiocarbon estimates place the age of the Main Postglacial Shoreline at between 6,000 and 6,500 years. The carseland sediments consist of grey silty clay with sand lenses and, locally, shells and gravel. Although these sediments are not uniform, a widespread and persistent layer of grey, micaceous, silty fine sand has been identified over a large area; it has been traced in boreholes at sites from Dornoch to the Forth valley and in some sections at Maryton. The layer is believed to have been deposited during a single, short-lived event, and it has recently been argued that the Storegga submarine slides off the Norwegian coast may have generated the responsible tsunami (Long, Smith, & Dawson, 1988).
An unusual succession deposited locally at the Jurassic-Cretaceous boundary is attributed to a tsunami by comparison with recent tsunami deposits. The succession includes soft-sediment deformation, clay containing marine and continental fossils, basal erosion surfaces, and erosional conglomerates. It is likely that the tsunami affected the Boulonnais coastal area on the flank of the London-Brabant Massif. The origin of the tsunami is still unknown: an earthquake is suspected, but other triggers, such as a giant landslide, an extraterrestrial bolide impact, or a volcanic eruption, may also have been involved. As is well known, instantaneous events such as earthquakes and tsunamis heavily disturb the sedimentary environment (Schnyder, Baudin, & Deconinck, 2005).
To date, tsunamis remain a constant feature of the world's oceans. In the 1990s alone there were 83 earthquake-generated tsunamis, and an average of some 57 events per decade has been recorded globally in historical times. Although there is no reason to believe that tsunami frequency differed in the past, few examples have been described in the literature for any given geological age. At the time of the tsunami, the Boulonnais region was a shallow epicontinental sea lying close to a low-relief landmass. The Upper Jurassic succession of the Boulonnais consists of sandstones and mudstones deposited in shoreface and offshore settings, and its vertical facies succession records episodic tectonic rejuvenations of the London-Brabant Massif and sea-level fluctuations. Across the European basins, the Upper Jurassic is characterized by a well-known second-order regressive trend, and these strata are well exposed along the coastal cliffs between Boulogne and Ambleteuse (Shiki, Tsuji, Minoura, & Yamazaki, 2011).
It is essential to measure the grain size and thickness of the deposits a tsunami leaves behind. Onshore grain-size measurements of the 2004 Indian Ocean tsunami deposits revealed macroscopic horizontal variations and gave new insight into tsunami sedimentation (Fujino, Naruse, Matsumoto, Sakakura, Suphawajruksakul, & Jarupongsakul, 2010). That tsunami caused severe erosion of beaches, river mouths, and shallow sea floors along the southwestern coast of Thailand, and it deposited substantial sand on the land surface. In most cases, tsunami deposits fine landward, with some fluctuations due to sediment settling and local entrainment.
Analysis of grain size and texture at Thotapalli-Valiazhikal after the Indian Ocean tsunami showed that formerly uniform grain sizes in several heavy minerals had become non-uniform. In addition, sediments that had long been well sorted became moderately sorted, and a unimodal grain-size distribution became bimodal. These indicators are enough to suggest that two or more energy sources were involved in redistributing the sediments. The earlier dominance of saltation gave way to combined saltation and traction load during the tsunami. Mineralogically, the tsunami waves raised the total heavy-mineral content: sillimanite and ilmenite increased, and the percentage of garnet rose abruptly. Furthermore, the magnetite percentage in post-tsunami samples increased, which may also reflect the tsunami's contribution, and further alteration is noticeable in the minerals of the post-tsunami samples, especially the ilmenite. Through the significant reworking of shelf sediments by the tsunami waves, the mineralogy and texture of the onshore deposit may have been completely changed to a depth of about 20-50 cm.
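Statements about sorting and modality like those above are conventionally quantified on the phi (φ) scale using the Folk and Ward (1957) graphic measures, where the inclusive graphic standard deviation (sorting) is σ = (φ84 − φ16)/4 + (φ95 − φ5)/6.6. The sketch below computes these statistics from a grain-size sample; the bimodal sample itself is invented purely for illustration.

```python
import numpy as np

def folk_ward_stats(phi: np.ndarray) -> tuple[float, float]:
    """Folk & Ward graphic mean and inclusive graphic sorting, phi units."""
    p5, p16, p50, p84, p95 = np.percentile(phi, [5, 16, 50, 84, 95])
    mean = (p16 + p50 + p84) / 3.0
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
    return mean, sorting

def sorting_class(sigma: float) -> str:
    """Verbal sorting classes after Folk & Ward."""
    bounds = [(0.35, "very well sorted"), (0.50, "well sorted"),
              (0.71, "moderately well sorted"), (1.00, "moderately sorted"),
              (2.00, "poorly sorted")]
    for limit, label in bounds:
        if sigma < limit:
            return label
    return "very poorly sorted"

# Invented bimodal sample: fine beach sand mixed with a coarser shelf population.
rng = np.random.default_rng(0)
phi = np.concatenate([rng.normal(2.5, 0.3, 800), rng.normal(1.0, 0.4, 200)])
mean, sigma = folk_ward_stats(phi)
print(f"mean = {mean:.2f} phi, sorting = {sigma:.2f} ({sorting_class(sigma)})")
```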
The 2011 Tohoku-Oki tsunami left deposits along the central east coast of Japan. After the catastrophe, studies were conducted to identify the sources of the tsunami deposits and the processes at work in a narrow valley of low topography. Most of the sand layers are characterized by heavy-mineral laminations, which, together with massive and upward-coarsening beds, are evidence of bed-load transport. Finally, it is essential to note that tsunamis are dangerous by nature; it is therefore highly important for scientists to find ways of predicting their occurrence. Prediction would help prevent casualties, property damage, and economic losses in the affected countries, and warning citizens in advance lets them prepare and move their property to safety (Fujino, Masuda, Tagomori, & Matsumoto, 2006).
The case study “How Will Astronomy Archives Survive the Data Tsunami?” brings out the fact that astronomy has begun generating more data than current methods can manage, serve, and process. The case study states that the holdings are growing at 0.5 PB per year and that over 60 PB of archived data will be available by 2020. It further outlines ways in which next-generation methods and tools for handling these data volumes can be developed. This paper takes a critical look at the case study, analyzing and assessing the issues it raises regarding performance degradation, evaluating the archival techniques and emerging technologies discussed, and examining the methods it offers for reducing the potentially large financial and computational costs of data archiving.
The increases in data sizes and usage expected over the next few years are already affecting the performance of astronomy archives and data centers. Among those affected is the NASA/IPAC Infrared Science Archive (IRSA), which is experiencing growth in both data volume and usage because it stores and hosts the data from the Spitzer Space Telescope and the WISE mission. These two missions alone have generated a greater data volume than the more than 35 missions already archived there.
The case also points out that archive usage has been driven up by data availability coupled with fast growth in program-based queries. This has contributed greatly to the performance degradation of the archive, and the situation could worsen as new data sets are made available. Increased requests for larger data volumes have also contributed to the degradation, as they slow the response to queries.
The case further notes that archives operate on low budgets that are usually fixed over several years, so simply adding infrastructure as usage of the archived data grows is not a viable answer to the dwindling performance. It also notes that as archive holdings grow, demand for the data grows with them and the queries become more sophisticated, since astronomical objects change over time; this compounds the falling performance of the data centers and archives.
The case brings out various archival techniques that can be used to handle the growth of archived data: cloud storage, R-trees, geographic information systems (GIS), and graphics processing units (GPUs). Cloud storage suits applications requiring much memory and processing, as the cost of processing is low. R-trees are used for indexing multidimensional data, thus speeding up access, while geographic information systems “store information about the world as a collection of thematic layers that can be linked together by geography” (Bayfieldcounty, 2013). Lastly, graphics processing units can be used to speed up the rendering of a picture or image on a display device.
All these methods can be used, but they might not be able to handle the growing amounts of archived data. For instance, given the large quantities of image data in astronomy, cloud computing would need high-throughput networks and parallel file systems to achieve its best performance, making it costly and uneconomical; other drawbacks, such as limited Internet bandwidth and the resulting loss of performance, also leave it unable to cope with the growth of archived data (Cloud Consulting, 2011).
Geographic information systems, on the other hand, are expensive and far more complex than astronomy requires; the high cost and complexity make them unsuitable for astronomical data archiving. Graphics processing units can be used, but they only support single-precision calculations, whereas astronomy more often needs double precision, and their performance is often limited by data transfer to and from the GPU.
Emerging technologies such as clustering, incorporation of the Montage image mosaic engine, and infrastructures such as the SciDB database could prove effective in solving the data archival problem in astronomy.
Clustering brings a set of computers together to create a supercomputer-like system. Each computer, usually called a node, has its own CPU, memory, operating system, and I/O subsystem and can communicate with the other nodes. Clustering enables heavy programs that would otherwise take a long time to run to execute on regular hardware (Narayan, 2005).
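As a single-machine analogue of this node-based parallelism, the sketch below splits a job across worker processes with Python's multiprocessing module; a real cluster would distribute the same pattern across physical nodes with MPI or a job scheduler, and the workload shown (summing pixel values over image chunks) is hypothetical.

```python
from multiprocessing import Pool
import numpy as np

def process_chunk(chunk: np.ndarray) -> float:
    """Stand-in for per-node work, e.g. photometry over one image section."""
    return float(chunk.sum())

if __name__ == "__main__":
    image = np.ones((4096, 4096))          # hypothetical image data
    chunks = np.array_split(image, 8)      # one chunk per "node"
    with Pool(processes=8) as pool:        # workers play the role of cluster nodes
        partials = pool.map(process_chunk, chunks)
    print("total flux:", sum(partials))
```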
The Montage image mosaic engine is a toolkit astronomers use to assemble individual astronomical images into a mosaic. It has been tested on operating systems such as Linux, Mac OS X, and Solaris. The engine produces an image mosaic in four steps, each implemented by Montage as a separate, independent module operating on files in the Flexible Image Transport System (FITS) format. FITS has become the standard used by astronomers because its headers are human-readable and the format is convenient for manipulating and managing large image files (Medina, 2007).
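Since Montage's modules operate on FITS files, the sketch below shows what working with that format typically looks like, using the astropy library rather than Montage itself (which is a separate toolkit with its own command-line modules); the filename is hypothetical.

```python
from astropy.io import fits

# Hypothetical filename; any FITS image produced by a survey would do.
with fits.open("tile_001.fits") as hdul:
    hdul.info()                     # list the HDUs in the file
    header = hdul[0].header         # human-readable keyword/value metadata
    data = hdul[0].data             # the image pixels as a NumPy array
    print(header.get("TELESCOP"), data.shape if data is not None else None)
```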
SciDB, on the other hand, is open-source software providing a next-generation computational database for data scientists such as astronomers and bioinformaticians, and for any other field that works with huge volumes of multidimensional data, such as genomic or geospatial data. The system combines analytical capabilities with data-management capabilities to support complex and flexible analytics, and it is declarative, array-oriented, and extensible (Cudré-Mauroux, 2010).
The case study proposes various methods of reducing the financial and computing costs that may be incurred while archiving data: graphics processing units (GPUs), R-tree-based indexing schemes, and academic cloud computing.
A graphics processing unit is made up of floating-point processors, and applications such as fixed-resolution mesh simulations, machine learning, and volume-rendering packages can benefit from running on it. GPU computing pairs the GPU with a CPU to accelerate general-purpose scientific and engineering applications: the compute-intensive portions of the code are offloaded to the GPU while the rest keeps running on the CPU, making the application as a whole run faster. GPUs are traditionally used to manipulate computer graphics, and they prove more effective than general-purpose CPUs wherever large blocks of data can be processed in parallel (Nvidia, 2013).
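A minimal sketch of this offloading pattern, assuming the CuPy library and a CUDA-capable GPU: the compute-intensive array operation runs on the GPU while setup and output stay on the CPU. The array contents are arbitrary illustration data.

```python
import numpy as np
import cupy as cp  # assumes a CUDA-capable GPU and the CuPy library

# CPU side: prepare the data (random numbers here, as a stand-in).
host = np.random.rand(4_000_000).astype(np.float32)

# Offload the compute-intensive portion to the GPU...
device = cp.asarray(host)            # copy host -> GPU memory
result = cp.sqrt(device) * 2.0       # element-wise work runs in parallel on the GPU

# ...then copy the result back; the transfer itself is often the bottleneck.
back = cp.asnumpy(result)
print(back[:5])
```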
R-tree-based indexing systems, on the other hand, support scalable, fast access to large databases of astronomical information and imaging data sets; they index multidimensional information and thus speed up access times. This technique is currently being developed by the Virtual Astronomical Observatory in an effort to offer seamless data-discovery services in astronomy. The approach provides speed-ups far beyond those of database table scans and is already used by the VAO Image and Catalog Discovery service and the Spitzer Space Telescope Heritage Archive.
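To make the R-tree idea concrete, the sketch below indexes a few bounding boxes (think sky regions in RA/Dec degrees) with the Python rtree package, which wraps libspatialindex; the coordinates and IDs are invented, and a production archive index would be far more elaborate.

```python
from rtree import index  # Python bindings for libspatialindex

idx = index.Index()

# Insert hypothetical image footprints as (min_ra, min_dec, max_ra, max_dec).
footprints = {1: (10.0, -5.0, 12.0, -3.0),
              2: (11.5, -4.0, 13.5, -2.0),
              3: (40.0, 20.0, 42.0, 22.0)}
for image_id, bbox in footprints.items():
    idx.insert(image_id, bbox)

# A range query touches only the tree nodes overlapping the search box,
# instead of scanning every row as a database table scan would.
hits = list(idx.intersection((11.0, -4.5, 12.5, -3.5)))
print("images overlapping the query region:", hits)  # e.g. [1, 2]
```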
In addition, the academic cloud computing used by the Canadian Astronomy Data Centre could also be adopted by other archives. This cloud system enables the delivery, processing, storage, analysis, and distribution of entire astronomical datasets.
In conclusion, the amount of information processed in astronomy grows every year, and new ways of managing, processing, storing, and accessing such large quantities of data must be devised.