SURVEY ON FLOOD FORECASTING METHODS
SANGEETHA.S1, JAYAKUMAR.D2
1 PG Scholar, Department of Computer Science & Engineering, IFET College of Engineering, Villupuram.
2 Associate Professor, Department of Computer Science & Engineering, IFET College of Engineering, Villupuram.
ABSTRACT
Artificial intelligence models (AIMs) have been successfully adopted for hydrological forecasting in a large body of literature. However, a comprehensive comparison of their applicability, in particular for short-term (i.e. hourly) water level prediction under heavy rainfall events, has rarely been discussed. Therefore, in this study, artificial neural networks (ANN), the intelligent multi-agent approach, and Markov Chain Monte Carlo (MCMC) were selected for comparison.
Flood warnings must be provided with an adequate lead time for the public and the emergency services to take actions to minimize flood damages.
Real time flood forecasting is an important and integral part of a flood warning service, and can help to provide more accurate and timely warnings. Depending on catchment characteristics and catchment response to rainfall, various types of flood forecasting models, including correlations, simple trigger flood forecasting, and more sophisticated real time catchment-wide integrated hydrological and hydrodynamic models may be adopted. These models provide flow and level forecasts at the selected key locations known as Forecast Points, which are usually located along major rivers or on streams near urban areas that have a history of flooding.
2. ARTIFICIAL NEURAL NETWORK: An ANN consists of a large number of parallel processing neurons, working independently and connected to each other by weighted links. It is capable of simulating complex nonlinear systems due to its ability of self-learning, self-adaptation, and generalization. The feed-forward neural network (FFNN), with one input layer, one or more hidden layers, and one output layer, is employed in this study. The back-propagation (BP) algorithm, first introduced by Rumelhart, is employed for training. The global error is propagated backwards through the network to adjust the connection weights.
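As an illustration of these weighted links, a minimal forward pass for a small FFNN can be sketched in Python. The 3-3-1 layout, random weights, and input names are hypothetical, not the study's actual network:

```python
import numpy as np

def sigmoid(z):
    # smooth nonlinear activation used in the hidden layer
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """One forward pass through a small 3-3-1 feed-forward network."""
    h = sigmoid(W1 @ x + b1)   # hidden-layer activations via weighted links
    return W2 @ h + b2         # linear output (e.g. predicted water level)

rng = np.random.default_rng(0)
x = rng.random(3)              # e.g. rainfall, upstream level, local level
W1, b1 = rng.random((3, 3)), rng.random(3)
W2, b2 = rng.random((1, 3)), rng.random(1)
y_hat = forward(x, W1, b1, W2, b2)
print(y_hat.shape)  # → (1,)
```

Each hidden neuron computes a weighted sum of all inputs, so the neurons operate independently yet are fully connected between layers.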
An Artificial Neural Network (ANN) is an information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information [1]. ANNs, also called neuro-computing or parallel distributed processing (PDP), provide an alternative approach for problems where algorithmic and symbolic approaches are not well suited. The objective of a neural network is to transform inputs into meaningful outputs. Research shows that the brain stores information as patterns; some of these patterns are very complicated and allow us to recognize objects from many different angles. This survey reviews the artificial neural network and analyses the techniques in terms of performance.
According to 2013 Census estimates, the city of Houston, Texas has a population of 2.196 million. Houston is located near the Gulf of Mexico in Southeast Texas, in Harris County, and its climate is best described as humid subtropical.
The tendency towards more intense precipitation events is projected to continue into the future. The more precipitation, the greater the risk of flooding. More heavy downpours may increase the likelihood of property damage, travel delays, and wider disruption.
The following steps are used to design the back-propagation neural network for the proposed work. The first step is to set the input and output data sets. The second step is to set the number of hidden layers and the hidden and output activation functions. The third step is to set the training function and training parameters. Finally, run the network.
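The four steps above can be sketched as a minimal NumPy back-propagation loop. The toy data, layer sizes, and learning rate below are illustrative, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1: input/output data sets (toy data; a real study would use
# rainfall and water-level records)
X = rng.random((50, 3))
y = X.sum(axis=1, keepdims=True) / 3.0

# Step 2: one hidden layer with sigmoid activation, linear output
n_hidden = 3
W1, b1 = rng.normal(size=(3, n_hidden)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_hidden, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Step 3: training parameters
lr, epochs = 0.5, 2000

# Step 4: run the network (batch gradient descent via back-propagation)
for _ in range(epochs):
    H = sigmoid(X @ W1 + b1)        # forward pass: hidden layer
    Y = H @ W2 + b2                 # forward pass: output layer
    err = Y - y                     # output error
    dW2 = H.T @ err / len(X)        # gradients by the chain rule
    db2 = err.mean(axis=0)
    dH = err @ W2.T * H * (1 - H)   # error propagated back to hidden layer
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(((Y - y) ** 2).mean())
print(f"final training MSE: {mse:.5f}")
```

The loop makes the role of each design step explicit: data, architecture, training parameters, and the training run itself.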
The training is divided into two phases: a learning phase and a testing phase. In the learning phase, an iterative process that updates the synaptic weights is formed upon the error BP (Back Propagation) algorithm. In the testing phase, the number of input and output parameters as well as the number of cases influence the neural network; the trained results are compared with the target to decide whether to continue iterating or to accept the obtained results. The common ANN structure for the three architectures is (3×3), meaning three neurons in the input layer and three neurons in the hidden layer. The training of each ANN architecture design is shown in fig. 3, fig. 4 and fig. 5.
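The stop-or-continue decision at the end of each testing pass can be sketched as a generic early-stopping loop. The `train_until` helper and the toy error-halving "trainer" below are hypothetical stand-ins for the real BP update:

```python
def train_until(update_step, evaluate, tol=1e-3, max_iter=10_000):
    """Iterate weight updates until the error against the target meets tol.

    update_step() performs one BP weight update; evaluate() returns the
    current error of the trained results against the target data.
    """
    for i in range(max_iter):
        update_step()
        err = evaluate()
        if err < tol:
            return i + 1, err      # target reached: stop iterating
    return max_iter, evaluate()    # iteration budget exhausted

# toy stand-in: a "trainer" that halves the error on every step
state = {"err": 1.0}
def step(): state["err"] *= 0.5
def ev(): return state["err"]

iters, err = train_until(step, ev, tol=1e-3)
print(iters, err)  # → 10 0.0009765625
```

Separating the update from the evaluation mirrors the learning/testing split described above.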
Public awareness is equally important: residents should understand local flood-prone areas, flood zones, and the available flood insurance coverage and rates.
The information used to develop my conclusions was drawn from the history of the Clearwater River: the average normal discharge of 40,000 cfs, the fact that the river can accommodate 55,000 cfs before flooding occurs, and that the river rises one foot with every increase of 2,600 cfs. For example, this information helped determine that four of the thirteen Peak Flood Discharges listed on the worksheet were not at flood stage, since the river can accommodate discharges below 55,000 cfs. The other nine of the thirteen Peak Flood Discharges were at or drastically above flood stage, with a maximum rise of 17.77 feet above flood stage at a discharge of 101,200 cfs. The average discharge of the top three noted floods in the Stream Gauge Data of Peak Flow Discharges is 93,613 cfs, which is used later in determining the extrapolated 75-year flood.
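The stage arithmetic above can be checked with a few lines of Python; the constants are the worksheet values quoted in the text, and the function name is illustrative:

```python
FLOOD_CAPACITY = 55_000   # cfs the channel carries before flooding begins
CFS_PER_FOOT = 2_600      # each 2,600 cfs of discharge raises the river 1 ft

def rise_above_flood_stage(discharge_cfs):
    """Feet the river rises above flood stage (0 if below capacity)."""
    excess = max(0, discharge_cfs - FLOOD_CAPACITY)
    return excess / CFS_PER_FOOT

print(round(rise_above_flood_stage(101_200), 2))  # → 17.77 (maximum event)
print(rise_above_flood_stage(48_000))             # → 0.0 (below capacity)
```

This reproduces the 17.77-foot maximum rise reported for the 101,200 cfs event.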
The primary goal of flood hazard mapping is to act as an information system that raises our understanding and awareness of coastal risk. Flood hazard mapping is essential for accurately designing and planning for flood-prone areas. It provides easily understood, easy-to-read maps that identify areas prone to flooding, and these maps are usually accompanied by rapidly accessible charts. Together, they can greatly reduce casualties and support effective response efforts when flooding occurs. Flood hazard maps are designed to increase awareness of the likelihood of flooding among the public, local authorities, and other organisations. People who live in flood-prone areas are also encouraged to prepare accordingly.
Based on model tests, Civica will propose the modelling approach for the Don River watershed. At the proposal stage, Civica proposed two approaches. The first is to create two models with different channel lengths: for small storms, the full channel length is used, and for large storms, the length of the valley corridor is used. Because the calibration and validation storms are usually small storms, the large-storm model cannot be calibrated, and maintaining separate models is harder to manage. The second approach is to change the model algorithm to switch dynamically to the corresponding channel length based on water level: a threshold water level is assigned to each channel element, and when the water level exceeds that threshold, the valley-corridor length is applied.
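The threshold rule of the second approach can be sketched as follows; the function name, threshold, and lengths are illustrative, not Civica's actual algorithm:

```python
def channel_length(water_level, threshold, full_length, valley_length):
    """Pick the channel length for an element based on its current stage.

    Below the threshold, flow stays in the main channel (full length);
    above it, flow spills into the valley corridor (corridor length).
    """
    return valley_length if water_level > threshold else full_length

# hypothetical element: 4 m threshold, 1200 m channel, 800 m valley corridor
small_storm = channel_length(3.2, 4.0, 1200.0, 800.0)  # → 1200.0
large_storm = channel_length(5.1, 4.0, 1200.0, 800.0)  # → 800.0
print(small_storm, large_storm)
```

With this rule, one model covers both storm regimes, avoiding the calibration and maintenance drawbacks of the two-model approach.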
Next, a Puls flood routing method was used to simulate the event again with theoretical values. The relationship between discharge and storage over time was obtained and used to calculate theoretical values of discharge and water level. These were compared with the experimental results, and a strong correlation was observed.
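A level-pool (modified) Puls routing step can be sketched as follows, assuming a simple linear storage-outflow relationship S = K*O for illustration; an actual study would use the tabulated storage-discharge relationship described above:

```python
# Modified Puls routing: the storage-indication equation
#   2*S2/dt + O2 = (I1 + I2) + (2*S1/dt - O1)
# with S = K*O can be solved for the outflow O2 in closed form.

def puls_route(inflow, dt, K, O0=0.0):
    """Route an inflow hydrograph through linear storage S = K*O."""
    outflow = [O0]
    for I1, I2 in zip(inflow, inflow[1:]):
        O1 = outflow[-1]
        rhs = I1 + I2 + (2 * K / dt - 1) * O1   # 2*S1/dt - O1 with S1 = K*O1
        O2 = rhs / (2 * K / dt + 1)             # solve 2*K*O2/dt + O2 = rhs
        outflow.append(O2)
    return outflow

# hypothetical triangular inflow hydrograph (m^3/s), hourly steps
inflow = [0, 10, 30, 60, 40, 20, 10, 5, 0, 0]
out = puls_route(inflow, dt=3600.0, K=7200.0)
print(max(inflow), round(max(out), 1))  # → 60 34.1 (peak is attenuated)
```

The routed peak is lower and later than the inflow peak, the attenuation and lag that the storage-discharge relationship produces.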