Friday August 9, 2024 12:15pm - 2:15pm IST
Authors - Shreyas Unnibhavi, Nishant Deshpande, Srinivas Surpur, Satish chikkamath, Nirmala S R, Suneeta V Budihal
Abstract - The study focuses on enhancing text summarization techniques. Summarization is a key preprocessing step that aids in extracting essential features from text, helping readers grasp the full content in less time. The proposed model employs an LSTM (long short-term memory) network, which is well suited to capturing contextual information; a hierarchical LSTM captures and preserves information more efficiently than a standard LSTM. The model has three components: an encoder, a decoder, and a funnel network, where the funnel network connects the other components and routes information between them. Text preprocessing is important for ensuring the quality and cleanliness of the input data. The model aims to enhance text summarization methodologies and improve the representation of textual information. A Transformer model is used for extracting the text. Categorical cross-entropy measures how different the predicted summary is from the actual summary; in the sparse variant, each target word is represented by an integer index corresponding to its position in the vocabulary. Sparse categorical cross-entropy is the loss function in this model, and the optimizer is root mean square propagation (RMSprop).
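The abstract's loss and optimizer can be illustrated with a minimal NumPy sketch. This is not the authors' code; the toy vocabulary, probabilities, and hyperparameter values below are illustrative assumptions. It shows (a) sparse categorical cross-entropy, where each target word is an integer vocabulary index rather than a one-hot vector, and (b) a single RMSprop parameter update.

```python
import numpy as np

def sparse_categorical_crossentropy(y_true, y_pred, eps=1e-12):
    """Average negative log-likelihood of the reference words.

    y_true: (T,) integer vocabulary indices of the reference summary words.
    y_pred: (T, V) predicted probability distribution over the vocabulary
            at each decoding step.
    """
    # Pick out the probability the model assigned to each correct word ...
    picked = y_pred[np.arange(len(y_true)), y_true]
    # ... and average the negative log-likelihoods.
    return float(-np.mean(np.log(picked + eps)))

def rmsprop_step(w, grad, cache, lr=1e-3, decay=0.9, eps=1e-8):
    """One RMSprop update: scale the step per parameter by a running
    average of squared gradients."""
    cache = decay * cache + (1 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

# Toy vocabulary of 4 words, reference summary of 3 words (indices 2, 0, 3).
y_true = np.array([2, 0, 3])
y_pred = np.array([
    [0.10, 0.10, 0.70, 0.10],
    [0.80, 0.10, 0.05, 0.05],
    [0.05, 0.05, 0.10, 0.80],
])
loss = sparse_categorical_crossentropy(y_true, y_pred)

# One RMSprop step on a single toy parameter with gradient 0.5.
w, cache = rmsprop_step(np.array([1.0]), np.array([0.5]), np.zeros(1))
```

The sparse formulation avoids materializing one-hot target vectors over the whole vocabulary, which matters when the vocabulary is large.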
Virtual Room C Goa, India
