Deep Learning with Hadoop, 1st Edition, Dipayan Dev
Table of Contents
Deep Learning with Hadoop
Credits
About the Author
About the Reviewers
www.PacktPub.com
Why subscribe?
Customer Feedback
Dedication
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Downloading the color images of this book
Errata
Piracy
Questions
1. Introduction to Deep Learning
Getting started with deep learning
Deep feed-forward networks
Various learning algorithms
Unsupervised learning
Supervised learning
Semi-supervised learning
Deep learning terminologies
Deep learning: A revolution in Artificial Intelligence
Motivations for deep learning
The curse of dimensionality
The vanishing gradient problem
Distributed representation
Classification of deep learning networks
Deep generative or unsupervised models
Deep discriminate models
Summary
2. Distributed Deep Learning for Large-Scale Data
Deep learning for massive amounts of data
Challenges of deep learning for big data
Challenges of deep learning due to massive volumes of data (first V)
Challenges of deep learning from a high variety of data (second V)
Challenges of deep learning from a high velocity of data (third V)
Challenges of deep learning to maintain the veracity of data (fourth V)
Distributed deep learning and Hadoop
Map-Reduce
Iterative Map-Reduce
Yet Another Resource Negotiator (YARN)
Important characteristics for distributed deep learning design
Deeplearning4j - an open source distributed framework for deep
learning
Major features of Deeplearning4j
Summary of functionalities of Deeplearning4j
Setting up Deeplearning4j on Hadoop YARN
Getting familiar with Deeplearning4j
Integration of Hadoop YARN and Spark for distributed deep
learning
Rules to configure memory allocation for Spark on Hadoop
YARN
Summary
3. Convolutional Neural Network
Understanding convolution
Background of a CNN
Architecture overview
Basic layers of CNN
Importance of depth in a CNN
Convolutional layer
Sparse connectivity
Improved time complexity
Parameter sharing
Improved space complexity
Equivariant representations
Choosing the hyperparameters for Convolutional layers
Depth
Stride
Zero-padding
Mathematical formulation of hyperparameters
Effect of zero-padding
ReLU (Rectified Linear Units) layers
Advantages of ReLU over the sigmoid function
Pooling layer
Where is it useful, and where is it not?
Fully connected layer
Distributed deep CNN
Most popular aggressive deep neural networks and their
configurations
Training time - major challenges associated with deep neural
networks
Hadoop for deep CNNs
Convolutional layer using Deeplearning4j
Loading data
Model configuration
Training and evaluation
Summary
4. Recurrent Neural Network
What makes recurrent networks distinctive from others?
Recurrent neural networks (RNNs)
Unfolding recurrent computations
Advantages of a model unfolded in time
Memory of RNNs
Architecture
Backpropagation through time (BPTT)
Error computation
Long short-term memory
Problem with deep backpropagation with time
Long short-term memory
Bi-directional RNNs
Shortfalls of RNNs
Solutions to overcome
Distributed deep RNNs
RNNs with Deeplearning4j
Summary
5. Restricted Boltzmann Machines
Energy-based models
Boltzmann machines
How Boltzmann machines learn
Shortfall
Restricted Boltzmann machine
The basic architecture
How RBMs work
Convolutional Restricted Boltzmann machines
Stacked Convolutional Restricted Boltzmann machines
Deep Belief networks
Greedy layer-wise training
Distributed Deep Belief network
Distributed training of Restricted Boltzmann machines
Distributed training of Deep Belief networks
Distributed back propagation algorithm
Performance evaluation of RBMs and DBNs
Drastic improvement in training time
Implementation using Deeplearning4j
Restricted Boltzmann machines
Deep Belief networks
Summary
6. Autoencoders
Autoencoder
Regularized autoencoders
Sparse autoencoders
Sparse coding
Sparse autoencoders
The k-Sparse autoencoder
How to select the sparsity level k
Effect of sparsity level
Deep autoencoders
Training of deep autoencoders
Implementation of deep autoencoders using Deeplearning4j
Denoising autoencoder
Architecture of a Denoising autoencoder
Stacked denoising autoencoders
Implementation of a stacked denoising autoencoder using
Deeplearning4j
Applications of autoencoders
Summary
7. Miscellaneous Deep Learning Operations using Hadoop
Distributed video decoding in Hadoop
Large-scale image processing using Hadoop
Application of Map-Reduce jobs
Natural language processing using Hadoop
Web crawler
Extraction of keyword and module for natural language
processing
Estimation of relevant keywords from a page
Summary
References
Credits
Author
Dipayan Dev
Copy Editor
Safis Editing
Reviewers
Shashwat Shriparv
Wissem EL Khlifi
Project Coordinator
Shweta H Birwatkar
Commissioning Editor
Amey Varangaonkar
Proofreader
Safis Editing
Acquisition Editor
Divya Poojari
Indexer
Mariammal Chettiyar
Content Development Editor
Sumeet Sawant
Graphics
Tania Dutta
Technical Editor
Nilesh Sawakhande
Production Coordinator
Melwyn Dsa
About the Author
Dipayan Dev has completed his M.Tech from National Institute of
Technology, Silchar with a first class first and is currently working as
a software professional in Bengaluru, India. He has extensive
knowledge and experience in non-relational database technologies,
having primarily worked with large-scale data over the last few
years. His core expertise lies in the Hadoop framework. During his postgraduation, Dipayan built an infinitely scalable framework for Hadoop, called Dr. Hadoop, which was published in a top-tier SCI-E indexed Springer journal (http://link.springer.com/article/10.1631/FITEE.1500015). Dr. Hadoop has recently been cited by Wikipedia in its Apache Hadoop article. Apart from that, he takes interest in a wide range
of distributed system technologies, such as Redis, Apache Spark,
Elasticsearch, Hive, Pig, Riak, and other NoSQL databases. Dipayan
has also authored various research papers and book chapters, which
are published by IEEE and top-tier Springer Journals. To know more
about him, you can also visit his LinkedIn profile
https://www.linkedin.com/in/dipayandev.
About the Reviewers
Shashwat Shriparv has more than 7 years of IT experience. He
has worked with various technologies along his career path, such as Hadoop and its subprojects, Java, .NET, and so on. He has
experience in technologies such as Hadoop, HBase, Hive, Pig, Flume,
Sqoop, Mongo, Cassandra, Java, C#, Linux, Scripting, PHP, C++, C,
Web technologies, and various real-life use cases in BigData
technologies as a developer and administrator. He likes to ride bikes,
has interest in photography, and writes blogs when not working.
He has worked with companies such as CDAC, Genilok, HCL,
UIDAI (Aadhaar), Pointcross; he is currently working with
CenturyLink Cognilytics.
He is the author of Learning HBase, Packt Publishing, the reviewer
of Pig Design Pattern book, Packt Publishing, and the reviewer of
Hadoop Real-World Solution cookbook, 2nd edition.
I would like to take this opportunity to thank everyone who has somehow made my life better, appreciated me at my best, and bore with me and supported me during my bad times.
Wissem El Khlifi is the first Oracle ACE in Spain and an Oracle
Certified Professional DBA with over 12 years of IT experience. He
earned a Computer Science Engineering degree from FST Tunisia, a Master's in Computer Science from UPC Barcelona, and a Master's in Big Data Science from UPC Barcelona. His areas of interest include Cloud Architecture, Big Data Architecture, and Big Data Management & Analysis.
His career has included the roles of Java analyst/programmer, Oracle Senior DBA, and big data scientist. He currently works as a Senior Big Data and Cloud Architect for Schneider Electric / APC. He writes numerous articles on his website http://www.oracle-class.com, and his Twitter handle is @orawiss.
www.PacktPub.com
For support files and downloads related to your book, please
visit www.PacktPub.com.
Did you know that Packt offers eBook versions of every book
published, with PDF and ePub files available? You can upgrade to the
eBook version at www.PacktPub.com and as a print book customer,
you are entitled to a discount on the eBook copy. Get in touch with
us at service@packtpub.com for more details.
At www.PacktPub.com, you can also read a collection of free
technical articles, sign up for a range of free newsletters and receive
exclusive discounts and offers on Packt books and eBooks.
https://www.packtpub.com/mapt
Get the most in-demand software skills with Mapt. Mapt gives you
full access to all Packt books and video courses, as well as industry-
leading tools to help you plan your personal development and
advance your career.
Why subscribe?
Fully searchable across every book published by Packt
Copy and paste, print, and bookmark content
On demand and accessible via a web browser
Customer Feedback
Thanks for purchasing this Packt book. At Packt, quality is at the
heart of our editorial process. To help us improve, please leave us an
honest review on this book's Amazon page at
https://www.amazon.com/Deep-Learning-Hadoop-Dipayan-
Dev/dp/1787124762.
If you'd like to join our team of regular reviewers, you can e-mail us
at customerreviews@packtpub.com. We award our regular reviewers
with free eBooks and videos in exchange for their valuable feedback.
Help us be relentless in improving our products!
Dedication
To my mother, Dipti Deb and father, Tarun Kumar Deb.
And also my elder brother, Tapojit Deb.
Preface
This book will teach you how to deploy large-scale datasets in deep
neural networks with Hadoop for optimal performance.
Starting with understanding what deep learning is, and what the
various models associated with deep neural networks are, this book
will then show you how to set up the Hadoop environment for deep
learning.
What this book covers
Chapter 1, Introduction to Deep Learning, covers how deep learning
has gained its popularity over the last decade and is now growing
even faster than machine learning due to its enhanced
functionalities. This chapter starts with an introduction to the real-life applications of Artificial Intelligence, the associated challenges, and how effectively deep learning is able to address all of these. The chapter provides an in-depth explanation of deep learning by addressing some of the major machine learning problems, such as the curse of dimensionality and the vanishing gradient problem. To prepare for the subsequent chapters, the classification of various deep learning networks is discussed in the latter part of this chapter. This chapter is primarily suitable for readers who want to learn the basics of deep learning without getting into the details of individual deep neural networks.
Chapter 2, Distributed Deep Learning for Large-Scale Data, explains that big data and deep learning are undoubtedly the
two hottest technical trends in recent days. Both of them are
critically interconnected and have shown tremendous growth in the
past few years. This chapter starts with how deep learning technologies can be furnished with massive amounts of unstructured data to facilitate the extraction of valuable hidden information from it. Famous technological companies such as Google, Facebook,
Apple, and the like are using this large-scale data in their deep
learning projects to train some aggressively deep neural networks in
a smarter way. Deep neural networks, however, show certain
challenges while dealing with Big data. This chapter provides a
detailed explanation of all these challenges. The latter part of the
chapter introduces Hadoop, to discuss how deep learning models
can be implemented using Hadoop's YARN and its iterative Map-
reduce paradigm. The chapter further introduces Deeplearning4j, a
popular open source distributed framework for deep learning and
explains its various components.
Chapter 3, Convolutional Neural Network, introduces Convolutional
neural network (CNN), a deep neural network widely used by top
technological industries in their various deep learning projects. CNN
comes with a vast range of applications in various fields such as
image recognition, video recognition, natural language processing,
and so on. Convolution, a special type of mathematical operation, is
an integral component of CNN. To get started, the chapter initially
discusses the concept of convolution with a real-life example.
Further, an in-depth explanation of Convolutional neural network is
provided by describing each component of the network. To improve
the performance of the network, CNN relies on three particularly important properties, namely sparse connectivity, parameter sharing, and equivariant representation. The chapter explains all of these to give a better grip on CNN. Further, CNN also possesses a few crucial hyperparameters, which help in deciding the dimension of the
output volume of the network. A detailed discussion along with the
mathematical relationship among these hyperparameters can be
found in this chapter. The latter part of the chapter focuses on
distributed convolutional neural networks and shows
its implementation using Hadoop and Deeplearning4j.
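As a preview of that mathematical relationship, the standard formula for the output width of a convolutional layer is (W - F + 2P) / S + 1, where W is the input width, F the filter size, P the zero-padding, and S the stride. The following is a minimal sketch in Java; the class name and the numbers used are only illustrative:

```java
public class ConvOutputSize {
    // Computes the output width of a convolutional layer using the
    // standard formula: (W - F + 2P) / S + 1.
    static int outputWidth(int inputWidth, int filterSize, int padding, int stride) {
        int numerator = inputWidth - filterSize + 2 * padding;
        if (numerator % stride != 0) {
            throw new IllegalArgumentException("Hyperparameters do not tile the input evenly");
        }
        return numerator / stride + 1;
    }

    public static void main(String[] args) {
        // A 32x32 input with 5x5 filters, no padding, stride 1 -> 28x28 output.
        System.out.println(outputWidth(32, 5, 0, 1)); // prints 28
        // Zero-padding of 2 preserves the spatial size: 32x32 -> 32x32.
        System.out.println(outputWidth(32, 5, 2, 1)); // prints 32
    }
}
```

The same formula applies independently to the height, which is why choosing stride and padding that divide the input evenly matters.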
Chapter 4, Recurrent Neural Network, explains that an RNN is a special type of neural network that can work over long sequences of vectors to produce other sequences of vectors. Recently, they have
become an extremely popular choice for modeling sequences of
variable length. RNN has been successfully implemented for various
applications such as speech recognition, online handwriting recognition, language modeling, and the like. The chapter provides a
detailed explanation of the various concepts of RNN by providing
essential mathematical relations and visual representations. RNN
possesses its own memory to store the output of the intermediate
hidden layer. Memory is the core component of the recurrent neural
network, which has been discussed in this chapter with an
appropriate block diagram. Moreover, the limitations of uni-
directional recurrent neural networks are provided, and to overcome
the same, the concept of bidirectional recurrent neural network
(BRNN) is introduced. Later, to address the problem of vanishing
gradient, introduced in chapter 1, a special unit of RNN, called Long
short-term Memory (LSTM) is discussed. In the end, the
implementation of distributed deep recurrent neural network with
Hadoop is shown with Deeplearning4j.
Chapter 5, Restricted Boltzmann Machines, notes that the models discussed in chapters 3 and 4 are discriminative models, and introduces a generative model called the Restricted Boltzmann machine (RBM). RBM is capable
of randomly producing visible data values when hidden parameters
are supplied to it. The chapter starts with introducing the concept of
an Energy-based model, and explains how Restricted Boltzmann
machines are related to it. Furthermore, the discussion progresses
towards a special type of RBM known as Convolutional Restricted
Boltzmann machine, which is a combination of both Convolution and
Restricted Boltzmann machines, and facilitates the extraction of features from high-dimensional images.
Deep Belief networks (DBN), a widely used multilayer network composed of several Restricted Boltzmann machines, is introduced in the latter part of the chapter. This part also discusses how DBN
can be implemented in a distributed environment using Hadoop. The
implementation of RBM as well as distributed DBN using
Deeplearning4j is discussed in the end of the chapter.
Chapter 6, Autoencoders, introduces one more generative model
called autoencoder, which is generally used for dimensionality
reduction, feature learning, or extraction. The chapter starts with
explaining the basic concept of autoencoder and its generic block
diagram. The core structure of an autoencoder is basically divided
into two parts, encoder and decoder. The encoder maps the input to
the hidden layer, whereas the decoder maps the hidden layer to the
output layer. The primary concern of a basic autoencoder is to copy
certain aspects of the input layer to the output layer. The next part
of the chapter discusses a type of autoencoder called sparse
autoencoder, which is based on the distributed sparse representation
of the hidden layer. Going further, the concept of deep autoencoder,
comprising multiple encoders and decoders is explained in-depth
with an appropriate example and block diagram. As we proceed,
denoising autoencoder and stacked denoising autoencoder are
explained in the latter part of the chapter. In conclusion, chapter 6
also shows the implementation of stacked denoising autoencoder
and deep autoencoder in Hadoop using Deeplearning4j.
Chapter 7, Miscellaneous Deep Learning Operations using Hadoop, focuses mainly on the design of the three most commonly used machine learning applications in a distributed environment. The
chapter discusses the implementation of large-scale video
processing, large-scale image processing, and natural language
processing (NLP) with Hadoop. It explains how the large-scale video
and image datasets can be deployed in Hadoop Distributed File
System (HDFS) and processed with the Map-Reduce algorithm. For NLP,
an in-depth explanation of the design and implementation is
provided at the end of the chapter.
What you need for this book
We expect all the readers of this book to have some background in computer science. This book mainly talks about different deep neural networks, and their designs and applications with Deeplearning4j. To extract the most out of the book, readers are expected to know the basics of machine learning, linear algebra, probability theory, and the concepts of distributed systems and Hadoop. For the implementation
of deep neural networks with Hadoop, Deeplearning4j has been
extensively used throughout this book. Following is the link for
everything you need to run Deeplearning4j:
https://deeplearning4j.org/quickstart
Who this book is for
If you are a data scientist who wants to learn how to perform deep
learning on Hadoop, this is the book for you. Knowledge of the basic
machine learning concepts and some understanding of Hadoop is
required to make the best use of this book.
Conventions
In this book, you will find a number of text styles that distinguish
between different kinds of information. Here are some examples of
these styles and an explanation of their meaning.
Code words in text, database table names, folder names, filenames,
file extensions, pathnames, dummy URLs, user input, and Twitter
handles are shown as follows: "The .build() function is used to
build the layer."
A block of code is set as follows:
public static final String DATA_URL =
"http://ai.stanford.edu/~amaas/data/sentiment/*";
When we wish to draw your attention to a particular part of a code
block, the relevant lines or items are set in bold:
MultiLayerNetwork model = new
MultiLayerNetwork(getConfiguration());
model.init();
New terms and important words are shown in bold. Words that
you see on the screen, for example, in menus or dialog boxes,
appear in the text like this: "In simple words, any neural network
with two or more layers (hidden) is defined as a deep feed-
forward network or feed-forward neural network."
Note
Warnings or important notes appear in a box like this.
Tip
Tips and tricks appear like this.
Reader feedback
Feedback from our readers is always welcome. Let us know what
you think about this book-what you liked or disliked. Reader
feedback is important for us as it helps us develop titles that you will
really get the most out of. To send us general feedback, simply e-
mail feedback@packtpub.com, and mention the book's title in the
subject of your message. If there is a topic that you have expertise
in and you are interested in either writing or contributing to a book,
see our author guide at www.packtpub.com/authors.
Customer support
Now that you are the proud owner of a Packt book, we have a
number of things to help you to get the most from your purchase.
Downloading the example code
You can download the example code files for this book from your
account at http://www.packtpub.com. If you purchased this book
elsewhere, you can visit http://www.packtpub.com/support and
register to have the files e-mailed directly to you.
You can download the code files by following these steps:
1. Log in or register to our website using your e-mail address and
password.
2. Hover the mouse pointer on the SUPPORT tab at the top.
3. Click on Code Downloads & Errata.
4. Enter the name of the book in the Search box.
5. Select the book for which you're looking to download the code
files.
6. Choose from the drop-down menu where you purchased this
book from.
7. Click on Code Download.
Once the file is downloaded, please make sure that you unzip or
extract the folder using the latest version of:
WinRAR / 7-Zip for Windows
Zipeg / iZip / UnRarX for Mac
7-Zip / PeaZip for Linux
The code bundle for the book is also hosted on GitHub at
https://github.com/PacktPublishing/Deep-Learning-with-Hadoop. We
also have other code bundles from our rich catalog of books and
videos available at https://github.com/PacktPublishing/. Check them
out!
Downloading the color images of
this book
We also provide you with a PDF file that has color images of the
screenshots/diagrams used in this book. The color images will help
you better understand the changes in the output. You can download
this file from
https://www.packtpub.com/sites/default/files/downloads/DeepLearni
ngwithHadoop_ColorImages.pdf.
Errata
Although we have taken every care to ensure the accuracy of our
content, mistakes do happen. If you find a mistake in one of our
books-maybe a mistake in the text or the code-we would be grateful
if you could report this to us. By doing so, you can save other
readers from frustration and help us improve subsequent versions of
this book. If you find any errata, please report them by visiting
http://www.packtpub.com/submit-errata, selecting your book,
clicking on the Errata Submission Form link, and entering the
details of your errata. Once your errata are verified, your submission
will be accepted and the errata will be uploaded to our website or
added to any list of existing errata under the Errata section of that
title.
To view the previously submitted errata, go to
https://www.packtpub.com/books/content/support and enter the
name of the book in the search field. The required information will
appear under the Errata section.
Piracy
Piracy of copyrighted material on the Internet is an ongoing problem
across all media. At Packt, we take the protection of our copyright
and licenses very seriously. If you come across any illegal copies of
our works in any form on the Internet, please provide us with the
location address or website name immediately so that we can
pursue a remedy.
Please contact us at copyright@packtpub.com with a link to the
suspected pirated material.
We appreciate your help in protecting our authors and our ability to
bring you valuable content.
Questions
If you have a problem with any aspect of this book, you can contact
us at questions@packtpub.com, and we will do our best to address
the problem.
Chapter 1. Introduction to
Deep Learning
"By far the greatest danger of Artificial Intelligence is that people
conclude too early that they understand it."
--Eliezer Yudkowsky
Ever wondered why it is often difficult to beat the computer at chess, even for the best players of the game? How Facebook is able to recognize your face among hundreds of millions of photos? How your mobile phone can recognize your voice and redirect a call to the correct person from the hundreds of contacts listed?
The primary goal of this book is to deal with many of those queries,
and to provide detailed solutions to the readers. This book can be used for a wide range of purposes by a variety of readers; however, we wrote the book with two main target audiences in mind. One of
the primary target audiences is undergraduate or graduate university
students learning about deep learning and Artificial Intelligence; the
second group of readers are the software engineers who already
have a knowledge of big data, deep learning, and statistical
modeling, but want to rapidly gain knowledge of how deep learning
can be used for big data and vice versa.
This chapter will mainly try to set a foundation for the readers by
providing the basic concepts, terminologies, characteristics, and the
major challenges of deep learning. The chapter will also put forward
the classification of different deep network algorithms, which have
been widely used by researchers over the last decade. The following
are the main topics that this chapter will cover:
Getting started with deep learning
Deep learning terminologies
Deep learning: A revolution in Artificial Intelligence
Classification of deep learning networks
Ever since the dawn of civilization, people have always dreamt of
building artificial machines or robots which can behave and work
exactly like human beings. From the Greek mythological characters
to the ancient Hindu epics, there are numerous such examples,
which clearly suggest people's interest and inclination towards
creating and having an artificial life.
During the initial computer generations, people had always
wondered if the computer could ever become as intelligent as a
human being! Going forward, even in medical science, the need for automated machines has become indispensable and almost unavoidable. With this need and constant research in the field,
Artificial Intelligence (AI) has turned out to be a flourishing
technology with various applications in several domains, such as
image processing, video processing, and many other diagnosis tools
in medical science too.
Although there are many problems that are resolved by AI systems
on a daily basis, nobody knows the specific rules for how an AI
system is programmed! A few of the intuitive problems are as
follows:
Google search, which does a really good job of understanding
what you type or speak
As mentioned earlier, Facebook is also somewhat good at
recognizing your face, and hence, understanding your interests
Moreover, with the integration of various other fields, for example,
probability, linear algebra, statistics, machine learning, deep
learning, and so on, AI has already gained a huge amount of
popularity in the research field over the course of time.
One of the key reasons for the early success of AI could be that it
basically dealt with fundamental problems for which the computer
did not require a vast amount of knowledge. For example, in 1997,
IBM's Deep Blue chess-playing system was able to defeat the world
champion Garry Kasparov [1]. Although this kind of achievement at
that time can be considered significant, it was definitely not a
burdensome task to train the computer with only the limited number
of rules involved in chess! Training a system with a fixed and limited
number of rules is termed as hard-coded knowledge of the
computer. Many Artificial Intelligence projects have undergone this
hard-coded knowledge about the various aspects of the world in
many traditional languages. As time progresses, this hard-coded
knowledge does not seem to work with systems dealing with huge
amounts of data. Moreover, the rules that the data followed also kept changing frequently. Therefore, most of the projects following that approach failed to live up to expectations.
The setbacks faced by this hard-coded knowledge implied that those
artificial intelligence systems needed some way of generalizing
patterns and rules from the supplied raw data, without the need
for external spoon-feeding. The ability of a system to do so is termed machine learning. There are various successful machine
learning implementations which we use in our daily life. A few of the
most common and important implementations are as follows:
Spam detection: Given an e-mail in your inbox, the model can
detect whether to put that e-mail in spam or in the inbox folder.
A common naive Bayes model can distinguish between such e-
mails.
Credit card fraud detection: A model that can detect whether the transactions performed within a specific time interval were carried out by the genuine customer or not.
One of the most popular machine learning models, given by Mor-Yosef et al. in 1990, used logistic regression to recommend whether a caesarean delivery was needed for the patient.
There are many such models which have been implemented with the
help of machine learning techniques.
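As a flavor of how compact such a model can be, the following is a minimal sketch of a naive Bayes spam filter over toy word counts. All the words, messages, and counts here are invented for illustration; a real filter would be trained on a large labeled corpus:

```java
import java.util.HashMap;
import java.util.Map;

public class NaiveBayesSpam {
    // Word counts per class, accumulated from a toy training set.
    private final Map<String, Integer> spamCounts = new HashMap<>();
    private final Map<String, Integer> hamCounts = new HashMap<>();
    private int spamTotal = 0, hamTotal = 0;

    void train(String[] words, boolean spam) {
        Map<String, Integer> counts = spam ? spamCounts : hamCounts;
        for (String w : words) counts.merge(w, 1, Integer::sum);
        if (spam) spamTotal += words.length; else hamTotal += words.length;
    }

    // Log-likelihood of the message under one class, with add-one smoothing.
    private double logLikelihood(String[] words, Map<String, Integer> counts, int total) {
        int vocab = spamCounts.size() + hamCounts.size(); // rough vocabulary size
        double logProb = 0.0;
        for (String w : words) {
            int c = counts.getOrDefault(w, 0);
            logProb += Math.log((c + 1.0) / (total + vocab));
        }
        return logProb;
    }

    boolean isSpam(String[] words) {
        // Uniform class prior; compare log-likelihoods under each class.
        return logLikelihood(words, spamCounts, spamTotal)
             > logLikelihood(words, hamCounts, hamTotal);
    }

    public static void main(String[] args) {
        NaiveBayesSpam nb = new NaiveBayesSpam();
        nb.train(new String[]{"win", "free", "prize", "now"}, true);
        nb.train(new String[]{"meeting", "agenda", "project", "report"}, false);
        System.out.println(nb.isSpam(new String[]{"free", "prize"}));     // true
        System.out.println(nb.isSpam(new String[]{"project", "report"})); // false
    }
}
```

Working in log space avoids numerical underflow when multiplying many small word probabilities, which is why the likelihoods are summed rather than multiplied.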
Figure 1.1: The figure shows an example of different types of representation. Let's say we want to train the machine to detect
some empty spaces in between the jelly beans. In the image on the
right side, we have sparse jelly beans, and it would be easier for the
AI system to determine the empty parts. However, in the image on
the left side, we have extremely compact jelly beans, and hence, it
will be an extremely difficult task for the machine to find the empty
spaces. Images sourced from USC-SIPI image database
A large portion of the performance of machine learning systems
depends on the data fed to the system. This is called the
representation of the data. Each piece of information included in the
representation is called a feature of the data. For example, if logistic
regression is used to detect a brain tumor in a patient, the AI system
will not try to diagnose the patient directly! Rather, the concerned
doctor will provide the necessary inputs to the system according to
the common symptoms of that patient. The AI system will then match
those inputs with the past inputs that were used to train
the system.
Based on the predictive analysis of the system, it will provide its
decision regarding the disease. Although logistic regression can learn
and decide based on the features given, it cannot influence or
modify the way features are defined. Logistic regression is a type of
regression model where the dependent variable has a limited
number of possible values based on the independent variable, unlike
linear regression. So, for example, if that model was provided with a
caesarean patient's report instead of the brain tumor patient's
report, it would surely fail to predict the correct outcome, as the
given features would never match with the trained data.
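To make the limited-output behaviour of logistic regression concrete, here is a from-scratch sketch that fits a one-feature logistic model by batch gradient descent; the symptom scores and binary labels are invented for illustration.

```python
import math

# Made-up 1-D training data: feature x (e.g. a symptom score) and label y.
xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0,   0,   0,   1,   1,   1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit w, b by batch gradient descent on the logistic (cross-entropy) loss.
w, b, eta = 0.0, 0.0, 0.1
for epoch in range(5000):
    grad_w = grad_b = 0.0
    for x, y in zip(xs, ys):
        p = sigmoid(w * x + b)      # predicted probability of class 1
        grad_w += (p - y) * x
        grad_b += (p - y)
    w -= eta * grad_w
    b -= eta * grad_b

# Unlike linear regression, the output is a probability in (0, 1),
# thresholded to one of a limited number of classes.
print(round(sigmoid(w * 0.8 + b)))  # class 0
print(round(sigmoid(w * 3.8 + b)))  # class 1
```

The dependent variable can only take a limited set of values (here 0 or 1), which is exactly the property the text contrasts with linear regression.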
These dependencies of machine learning systems on the
representation of the data are not really unknown to us! In fact,
much of computer science performs better or worse depending on how
the data are represented. For example, the quality of a database is
judged by how well the schema is designed. The execution of
any database query, even on a thousand or a million rows of data,
becomes extremely fast if the table is indexed properly. Therefore,
the dependency of AI systems on data representation should
not surprise us.
There are many such examples in daily life too, where the
representation of the data decides our efficiency. Locating a person
among 20 people is obviously easier than locating the same person
in a crowd of 500 people. A visual representation of two different
types of data representation is shown in the preceding Figure 1.1.
Therefore, if the AI systems are fed with appropriately featured
data, even the hardest problems can be resolved. However,
collecting and feeding the desired data to the system in the correct
way has been a serious impediment for the computer programmer.
There can be numerous real-time scenarios where extracting the
features is a cumbersome task. Therefore, the way the data
are represented is a prime factor in the intelligence of the
system.
Note
Finding cats amidst a group of humans and cats can be
extremely complicated if the features are not appropriate. We
know that cats have tails; therefore, we might like to detect the
presence of a tail as a prominent feature. However, given the
different tail shapes and sizes, it is often difficult to describe
exactly what a tail will look like in terms of pixel values! Moreover,
tails could sometimes be confused with the hands of humans.
Also, overlapping objects could hide a cat's tail, making the
image even more complicated.
From all the above discussion, it can be concluded that the success
of AI systems depends mainly on how the data are represented.
Also, different representations can entangle and hide, to different
degrees, the explanatory factors of variation behind the data.
Representation learning is one of the most popular and widely
practiced learning approaches used to cope with these specific
problems. Learning the representation of the next layer from the
existing representation of the data can be defined as representation
learning. Ideally, all representation learning algorithms have the
advantage of learning representations that capture the underlying
factors, a subset of which might be applicable to each particular
sub-task. A simple illustration is given in the following Figure 1.2:
Figure 1.2: The figure illustrates representation learning. The middle
layers are able to discover the explanatory factors (hidden layers, in
blue rectangular boxes). Some of the factors explain each task's
target, whereas some explain the inputs
However, extracting high-level features
from a massive amount of raw data, which requires some sort of
human-level understanding, has shown its limitations. There can be
many such examples:
Differentiating the cries of two babies of a similar age.
Identifying the image of a cat's eye in both day and night time.
This becomes tricky, because a cat's eyes glow at night, unlike
during the daytime.
In such edge cases, representation learning does not
perform exceptionally well, and shows its limitations.
Deep learning, a sub-field of machine learning, can rectify this
major problem of representation learning by building multiple levels
of representations or learning a hierarchy of features from a series
of other simple representations and features [2] [8].
Figure 1.3: The figure shows how a deep learning system can
represent the human image by identifying various combinations such
as corners and contours, which can be defined in terms of edges.
Image reprinted with permission from Ian Goodfellow, Yoshua
Bengio, and Aaron Courville, Deep Learning, published by The MIT
Press
The preceding Figure 1.3 shows an illustration of a deep learning
model. It is generally a cumbersome task for the computer to
decode the meaning of raw unstructured input data, represented
in this image as a collection of different pixel values. A mapping
function that converts the group of pixels to identify the image
is ideally difficult to achieve. Also, directly training the computer for
this kind of mapping is almost insuperable. For these types of
tasks, deep learning resolves the difficulty by creating a series of
subsets of mappings to reach the desired output. Each subset of
mappings corresponds to a different layer of the model. The
input contains the variables that one can observe, and hence, is
represented in the visible layer. From the given input we can
incrementally extract the abstract features of the data. As these
values are not available or visible in the given data, these layers are
termed hidden layers.
In the image, from the first layer of data, the edges can easily be
identified just by a comparative study of the neighboring pixels. The
second hidden layer can distinguish the corners and contours from
the first hidden layer's description of the edges. From this second
hidden layer, which describes the corners and contours, the third
hidden layer can identify the different parts of the specific objects.
Ultimately, the different objects present in the image can be
distinctly detected from the third layer.
Deep learning started its journey in earnest in 2006, when Hinton et
al. [2], followed by Bengio et al. in 2007 [3], initially focused on the
MNIST digit classification problem. In the last few years, deep
learning has seen major transitions from digits to object recognition
in natural images. Apart from this, one of the major breakthroughs
was achieved by Krizhevsky et al. in 2012 [4] using the ImageNet
dataset.
The scope of this book is mainly limited to deep learning, so before
diving into it directly, the necessary definitions of deep learning
should be discussed.
Many researchers have defined deep learning in many ways, and
hence, in the last 10 years, it has gone through many definitions
too! The following are a few of the widely accepted definitions:
As noted by GitHub, deep learning is a new area of machine
learning research, which has been introduced with the objective
of moving machine learning closer to one of its original goals:
Artificial Intelligence. Deep learning is about learning multiple
levels of representation and abstraction, which help to make
sense of data such as images, sounds, and texts.
As recently updated by Wikipedia, deep learning is a branch of
machine learning based on a set of algorithms that attempt to
model high-level abstractions in the data by using a deep graph
with multiple processing layers, composed of multiple linear and
non-linear transformations.
As the definitions suggest, deep learning can also be considered a
special type of machine learning. Deep learning has achieved
immense popularity in the field of data science with its ability to
learn complex representations from various simple features. To get
an in-depth grip on deep learning, we have listed a few
terminologies which will be frequently used in the upcoming
chapters. The next topic of this chapter will help you lay a
foundation for deep learning by providing various terminologies and
important networks used for deep learning.
Getting started with deep
learning
To understand the journey of deep learning in this book, one must
know all the terminologies and basic concepts of machine learning.
However, if you already have enough insight into machine learning
and related terms, you should feel free to ignore this section and
jump to the next topic of this chapter. Readers who are enthusiastic
about data science, and want to learn machine learning thoroughly,
can follow Machine Learning by Tom M. Mitchell (1997) [5] and
Machine Learning: a Probabilistic Perspective (2012) [6].
Note
Neural networks do not perform miracles. But, used sensibly,
they can produce some amazing results.
Deep feed-forward networks
Neural networks can be recurrent as well as feed-forward.
Feed-forward networks do not have any loops in their graph, and
are arranged in a set of layers. A network with many layers is
said to be a deep network. In simple words, any neural network with
two or more hidden layers is defined as a deep feed-forward
network or feed-forward neural network. Figure 1.4 shows a
generic representation of a deep feed-forward neural network.
Deep feed-forward networks work on the principle that with an
increase in depth, the network can execute more sequential
instructions. Sequential instructions offer great power, as later
instructions can refer to the results of earlier ones.
The aim of a feed-forward network is to approximate some function f.
For example, a classifier y=f(x) maps an input x to a category y. A
deep feed-forward network defines the mapping y=f(x; α) and
learns the value of the parameter α that gives the most
appropriate approximation of the function. The following Figure 1.4
shows a simple representation of a deep feed-forward network, to
illustrate the architectural difference from a traditional neural network.
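The mapping y = f(x; α) can be made concrete with a tiny sketch: a two-layer feed-forward pass whose parameters α are hand-set (not learned) to compute XOR, a function famously beyond any single-layer network.

```python
def relu(v):
    """Element-wise rectified linear activation."""
    return [max(0.0, x) for x in v]

def dense(x, W, b):
    """One fully connected layer: returns W·x + b."""
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) + b_j
            for row, b_j in zip(W, b)]

def feed_forward(x):
    """A minimal feed-forward network y = f(x; α) with hand-set
    parameters α chosen so that the network computes XOR."""
    h = relu(dense(x, [[1.0, 1.0], [1.0, 1.0]], [0.0, -1.0]))  # hidden layer
    (y,) = dense(h, [[1.0, -2.0]], [0.0])                      # output layer
    return y

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, feed_forward(x))  # 0, 1, 1, 0
```

In practice α would be learned by gradient-based training rather than set by hand; the point here is only the layered structure of the mapping.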
Note
A deep neural network is a feed-forward network with many
hidden layers.
Figure 1.4: Figure shows the representation of a shallow and deep
feed-forward network
Various learning algorithms
Datasets are considered to be the building blocks of a learning
process. A dataset can be defined as a collection of interrelated sets
of data, which is comprised of separate entities, but which can be
used as a single entity depending on the use-case. The individual
data elements of a dataset are called data points.
The following Figure 1.5 gives the visual representation of the
various data points collected from a social network analysis:
Figure 1.5: Image shows the scattered data points of social network
analysis. Image sourced from Wikipedia
Unlabeled data: This part of the data consists of human-
generated objects, which can be easily obtained from the
surroundings. Some examples are X-rays, log file data,
news articles, speech, videos, tweets, and so on.
Labelled data: Labelled data are obtained by annotating a set of
unlabeled data. These types of data are usually well formatted,
classified, tagged, and easily understandable by human beings
for further processing.
From the top-level understanding, machine learning techniques can
be classified as supervised and unsupervised learning, based on how
their learning process is carried out.
Unsupervised learning
In unsupervised learning algorithms, there is no desired output for
the given input datasets. The system learns meaningful properties
and features from its experience during the analysis of the dataset.
In deep learning, the system generally tries to learn the whole
probability distribution of the data points. There are various types of
unsupervised learning algorithms that perform clustering. In simple
words, clustering means separating the data points into clusters
of similar types of data. However, with this type of
learning, there is no feedback based on the final output, that is,
there won't be any teacher to correct you! Figure 1.6 shows a basic
overview of unsupervised clustering:
Figure 1.6: Figures shows a simple representation of unsupervised
clustering
A real-life example of an unsupervised clustering algorithm is Google
News. When we open a topic under Google News, it shows us a
number of hyperlinks redirecting to several pages. Each topic
can be considered as a cluster of hyperlinks that point to
related, independent pages.
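The clustering idea above can be sketched with a minimal k-means implementation in plain Python; the 2-D points and the choice of k = 2 are made up for illustration.

```python
# Two made-up groups of 2-D points.
points = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9),
          (5.0, 5.1), (5.2, 4.9), (4.8, 5.0)]

def kmeans(points, k, iters=10):
    """Plain k-means: no labels are given; the algorithm groups
    similar points by alternating assignment and update steps."""
    centers = list(points[:k])  # deterministic initialization
    clusters = []
    for _ in range(iters):
        # assignment step: each point joins its nearest centre
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # update step: move each centre to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers, clusters

centers, clusters = kmeans(points, 2)
print(sorted(len(c) for c in clusters))  # two natural groups of three
```

Note that the algorithm receives no "correct answer" at any point: the grouping emerges purely from the geometry of the data, which is the defining trait of unsupervised learning.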
Supervised learning
In supervised learning, unlike unsupervised learning, there is an
expected output associated with every step of the experience. The
system is given a dataset, and it already knows how the desired
output will look, along with the correct relationship between the
input and output of every associated layer. This type of learning is
often used for classification problems.
The following visual representation is given in Figure 1.7:
Figure 1.7: Figure shows the classification of data based on
supervised learning
Real-life examples of supervised learning include face detection, face
recognition, and so on.
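A minimal sketch of supervised classification is a 1-nearest-neighbour classifier: every training point carries its desired output (label), and a new point receives the label of its closest neighbour. The feature vectors and labels below are invented for illustration.

```python
# Tiny labelled dataset: (feature vector, class). In supervised learning
# the desired output is known for every training example.
train = [((1.0, 1.0), "cat"), ((1.2, 0.9), "cat"),
         ((4.0, 4.2), "dog"), ((4.1, 3.9), "dog")]

def classify(x, train):
    """1-nearest-neighbour: predict the label of the closest training point."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(train, key=lambda item: dist2(x, item[0]))[1]

print(classify((1.1, 1.1), train))  # cat
print(classify((3.8, 4.0), train))  # dog
```

The contrast with the k-means sketch above is exactly the supervised/unsupervised divide: here the labels drive the prediction, there no labels exist at all.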
Although supervised and unsupervised learning look like distinct
categories, they are often connected to each other by various means.
Hence, the fine line between these two kinds of learning is often hazy
to the student fraternity.
The preceding statement can be formulated with the following
mathematical expression:
The general product rule of probability states that for a dataset
n ∈ ℝᵗ, the joint distribution can be fragmented as follows:
p(n) = ∏ᵢ₌₁ᵗ p(nᵢ | n₁, ..., nᵢ₋₁)
This factorization signifies that the apparently unsupervised problem
of modelling p(n) can be resolved by solving t supervised problems.
Apart from this, the conditional probability p(k | n), which is a
supervised problem, can be approached using unsupervised learning
algorithms that learn the joint distribution p(n, k).
Although these two types are not completely separate identities,
they often help to classify the machine learning and deep learning
algorithms based on the operations performed. In generic terms,
cluster formation, identifying the density of a population based on
similarity, and so on are termed as unsupervised learning, whereas
structured formatted output, regression, classification, and so on are
recognized as supervised learning.
Semi-supervised learning
As the name suggests, in this type of learning both labelled and
unlabeled data are used during training. It is a class of supervised
learning which uses a vast amount of unlabeled data during training.
For example, semi-supervised learning is used in a Deep belief
network (explained later), a type of deep network where some
layers learn the structure of the data (unsupervised), whereas one
layer learns how to classify the data (supervised learning).
In semi-supervised learning, unlabeled data from p(n) and labelled
data from p(n, k) are used to estimate the probability of k given n,
that is, p(k | n).
Figure 1.8: Figure shows the impact of a large amount of unlabelled
data during the semi-supervised learning technique. Figure obtained
from Wikipedia
In the preceding Figure 1.8, the top part shows the decision
boundary that the model adopts after distinguishing the white and
black circles. The bottom part displays another decision
boundary, which the model adopts when, in addition to the
two different categories of circles, a collection of unlabeled data
(grey circles) is also included. This type of training can be viewed as
creating the clusters and then marking them with the labelled data,
which moves the decision boundary away from the high-density data
regions.
You can refer to Chapelle et al.'s book [7] to learn more
about semi-supervised learning methods.
Now that you have a foundation in what Artificial
Intelligence, machine learning, and representation learning are, we
can move our entire focus to deep learning with
further description.
From the previously mentioned definitions of deep learning, two
major characteristics of deep learning can be pointed out, as follows:
A way of experiencing unsupervised and supervised learning of
the feature representation through successive knowledge from
subsequent abstract layers
A model comprising multiple abstract stages of non-linear
information processing
Deep learning terminologies
Deep Neural Network (DNN): This can be defined as a
multilayer perceptron with many hidden layers. The layers are
fully connected: each unit receives connections from all units of
the previous layer. The weights are initialized with either
supervised or unsupervised pre-training.
Recurrent Neural Networks (RNN): RNN is a kind of deep
learning network that is specially used in learning from time
series or sequential data, such as speech, video, and so on. The
primary concept of RNN is that the observations from the
previous state need to be retained for the next state. The recent
hot topic in deep learning with RNN is Long short-term
memory (LSTM).
Deep belief network (DBN): This type of network [9] [10]
[11] can be defined as a probabilistic generative model with
visible units and multiple layers of latent (hidden) variables. Each
hidden layer learns a statistical relationship between the units in
the lower layer. The higher the layer, the more complex the
relationships become. This type of network can be productively
trained using greedy layer-wise training, where the hidden layers
are trained one at a time in a bottom-up fashion.
Boltzmann machine (BM): This can be defined as a network
of symmetrically connected, neuron-like units, which are
capable of taking stochastic decisions about whether to remain
on or off. BMs have a simple learning algorithm, which
allows them to uncover many interesting features that represent
complex regularities in the training dataset.
Restricted Boltzmann machine (RBM): RBM, which is a
generative stochastic Artificial Neural Network, is a special type
of Boltzmann Machine. These types of networks have the
capability to learn a probability distribution over a collection of
datasets. An RBM consists of a layer of visible and hidden units,
but with no visible-visible or hidden-hidden connections.
Convolutional neural networks: Convolutional neural
networks are a class of neural networks in which the layers are
sparsely connected to each other and to the input layer. Each neuron
of a subsequent layer is responsible for only a part of the input.
Deep convolutional neural networks have accomplished some
unmatched performance in the fields of location recognition,
image classification, face recognition, and so on.
Deep auto-encoder: A deep auto-encoder is a type of auto-
encoder that has multiple hidden layers. This type of network
can be pre-trained as a stack of single-layered auto-encoders.
The training is usually performed greedily: first, we train
the first hidden layer to reconstruct the input data, which is then
used to train the next hidden layer to reconstruct the states of
the previous hidden layer, and so on.
Gradient descent (GD): This is an optimization algorithm
used widely in machine learning to determine the coefficients of
a function (f) that minimize the overall cost function. Gradient
descent is mostly used when it is not possible to calculate the
desired parameters analytically (for example, using linear algebra),
and they must be found by an optimization algorithm.
In gradient descent, the weights of the model are incrementally
updated with every single iteration over the training dataset (epoch).
The cost function J(w), with the sum of the squared errors, can be
written as follows:
J(w) = (1/2) Σⱼ (target⁽ʲ⁾ − output⁽ʲ⁾)²
The magnitude and direction of the weight update are calculated by
taking a step in the opposite direction of the cost gradient, as follows:
∆wᵢ = −η ∂J/∂wᵢ
In the preceding equation, η is the learning rate of the network.
Weights are updated incrementally after every epoch with the
following rule:
for one or more epochs,
    for each weight i,
        wᵢ := wᵢ + ∆wᵢ
    end
end
Popular examples that can be optimized using gradient descent are
Logistic Regression and Linear Regression.
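The epoch-wise update rule above can be sketched for one-feature linear regression; the data points (roughly following y = 2x + 1) are made up for illustration.

```python
# Made-up one-feature data, roughly following y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

w, b = 0.0, 0.0   # model parameters
eta = 0.01        # learning rate η

for epoch in range(2000):
    # gradient of J(w) = 1/2 Σ_j (y_j − (w·x_j + b))² over the FULL dataset
    grad_w = sum(((w * x + b) - y) * x for x, y in zip(xs, ys))
    grad_b = sum(((w * x + b) - y) for x, y in zip(xs, ys))
    # one step in the opposite direction of the cost gradient
    w -= eta * grad_w
    b -= eta * grad_b

print(round(w, 1), round(b, 1))  # close to the true slope 2 and intercept 1
```

Note that the weights change only once per epoch: the gradient is accumulated over the whole dataset before a single step is taken, which is exactly the property SGD relaxes next.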
Stochastic Gradient Descent (SGD): Various deep learning
algorithms, which operate on large datasets, are
based on an optimization algorithm called stochastic gradient
descent. Gradient descent performs well only in the case of
small datasets. In the case of very large-scale
datasets, this approach becomes extremely costly. In gradient
descent, only one single step is taken for one full pass over the
entire training dataset; thus, as the dataset's size
increases, the whole algorithm eventually slows down. The
weights are updated at a very slow rate; hence, the time it
takes to converge to the global cost minimum becomes
protracted.
Therefore, to deal with such large-scale datasets, a variation of
gradient descent called stochastic gradient descent is used. Unlike
gradient descent, the weights are updated after each training
sample, rather than at the end of a full pass over the dataset.
until cost minimum is reached
    for each training sample j:
        for each weight i
            wᵢ := wᵢ + ∆wᵢ
        end
    end
end
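The same made-up data used for the batch sketch can be fitted with the stochastic variant, where the weights are updated after every individual sample rather than once per epoch:

```python
# The same made-up data as the batch example: y ≈ 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]

w, b, eta = 0.0, 0.0, 0.01

for epoch in range(2000):
    for x, y in zip(xs, ys):          # one weight update PER SAMPLE
        error = (w * x + b) - y
        w -= eta * error * x
        b -= eta * error

print(round(w, 1), round(b, 1))  # near the batch solution
```

With a fixed learning rate, SGD hovers in a small neighbourhood of the batch optimum rather than settling exactly on it, but each step is cheap, which is what makes it practical at scale.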
In the last few years, deep learning has gained tremendous
popularity, as it has become a junction for the research areas of many
widely practiced subjects, such as pattern recognition, neural
networks, graphical modelling, machine learning, and signal
processing.
The other important reasons for this popularity can be summarized
in the following points:
In recent years, the capability of GPUs (Graphics Processing
Units) has increased drastically
The size of the datasets used for training purposes
has increased significantly
Recent research in machine learning, data science, and
information processing has shown some serious advancements
Detailed descriptions of all these points will be provided in an
upcoming topic in this chapter.
51. THE TALES OF THE
HEPTAMERON
OF MARGARET, QUEEN OF
NAVARRE
IN FIVE VOLUMES
52. VOLUME THE FOURTH
CONTENTS
FOURTH DAY.
PROLOGUE.
TALE XXXI.
TALE XXXII.
TALE XXXIII.
TALE XXXIV.
TALE XXXV.
TALE XXXVI.
TALE XXXVII.
TALE XXXVIII.
TALE XXXIX.
TALE XL.
FIFTH DAY.
PROLOGUE.
TALE XLI.
TALE XLII.
TALE XLIII.
TALE XLIV.(A).
TALE XLIV. (B).
TALE XLV.
TALE XLVI. (A).
TALE XLVI.(B).
TALE XLVII.
TALE XLVIII.
53. TALE XLIX.
TALE L.
APPENDIX.
A. (Tale XXXVI., Page 63.)
ILLUSTRATIONS
Frontispiece
Titlepage
007a.jpg the Wicked Friar Captured
007.jpg Page Image
0016.jpg Tailpiece
017a.jpg Bernage Observing the German Lady’s Strange Penance
017.jpg Page Image
028.jpg Tailpiece
029a.jpg the Execution of The Wicked Priest and his Sister
029.jpg Page Image
037.jpg Tailpiece
039a.jpg the Grey Friar Imploring The Butcher to Spare his Life
039.jpg Page Image
047.jpg Tailpiece
049a.jpg the Lady Embracing The Supposed Friar
049.jpg Page Image
062.jpg Tailpiece
063a.jpg the Clerk Entreating Forgiveness of The President
063.jpg Page Image
072.jpg Tailpiece
073a.jpg the Lady of Loué Bringing Her Husband The Basin Of
Water
073.jpg Page Image
54. 081.jpg Tailpiece
083a.jpg the Lady of Tours Questioning Her Husband’s Mistress
083.jpg Page Image
088.jpg Tailpiece
089a.jpg the Lord of Grignaulx Catching The Pretended Ghost
089.jpg Page Image
094.jpg Tailpiece
095a.jpg the Count of Jossebelin Murdering his Sister’s Husband
095.jpg Page Image
109.jpg Tailpiece
115a.jpg the Beating of The Wicked Grey Friar
115.jpg Page Image
122.jpg Tailpiece
123a.jpg the Girl Refusing The Gift of The Young Prince
123.jpg Page Image
142.jpg Tailpiece
143a.jpg Jambicque Repudiating Her Lover
143.jpg Page Image
155.jpg Tailpiece
157.jpg Page Image
162.jpg Tailpiece
163a.jpg the Lovers Returning from Their Meeting in The Garden
163.jpg Page Image
176.jpg Tailpiece
177a.jpg the Man of Tours and his Serving-maid in The Snow
177.jpg Page Image
186.jpg Tailpiece
187.jpg Page Image
193.jpg Tailpiece
195a.jpg the Young Man Beating his Wife
55. 195.jpg Page Image
201.jpg Tailpiece
203a.jpg the Gentleman Reproaching his Friend for His Jealousy
203.jpg Page Image
211.jpg Tailpiece
213a.jpg the Grey Friars Caught and Punished
213.jpg Page Image
218.jpg Tailpiece
219a.jpg the Countess Facing Her Lovers
219.jpg Page Image
232.jpg Tailpiece
233a.jpg the Lady Killing Herself on The Death of Her Lover
233.jpg Page Image
240.jpg Tailpiece
DETAILED CONTENTS OF VOLUME IV.
FOURTH DAY.
Prologue
Tale XXXI. Punishment of the wickedness of a Friar who sought to
lie
with a gentleman’s wife.
Tale XXXII. How an ambassador of Charles VIII., moved by the
repentance
of a German lady, whom her husband compelled to drink out of her
lover’s
skull, reconciled husband and wife together.
Tale XXXIII. The hypocrisy of a priest who, under the cloak of
sanctity,
had lain with his own sister, is discovered and punished by the
56. wisdom
of the Count of Angoulême.
Tale XXXIV. The terror of two Friars who believed that a butcher
intended to murder them, whereas the poor man was only
speaking of his
Pigs.
Tale XXXV. How a husband’s prudence saves his wife from the risks
she
incurred while thinking to yield to merely a spiritual love.
Tale XXXVI. The story of the President of Grenoble, who saves the
honour
of his house by poisoning his wife with a salad.
Tale XXXVII. How the Lady of Loué regained her husband’s
affection.
Tale XXXVIII. The kindness of a townswoman of Tours to a poor
farm-woman who is mistress to her husband, makes the latter so
ashamed
of his faithlessness that he returns to his wife.
Tale XXXIX. How the Lord of Grignaulx rid one of his houses of a
pretended ghost.
Tale XL. The unhappy history of the Count de Jossebelin’s sister,
who
shut herself up in a hermitage because her brother caused her
husband to
be slain.
FIFTH DAY.
Prologue
Tale XLI. Just punishment of a Grey Friar for the unwonted
penance that
he would have laid upon a maiden.
57. Tale XLII. The virtuous resistance made by a young woman of
Touraine
causes a young Prince that is in love with her, to change his desire
to
respect, and to bestow her honourably in marriage.
Tale XLIII. How a little chalk-mark revealed the hypocrisy of a lady
called Jambicque, who was wont to hide the pleasures she
indulged in,
beneath the semblance of austerity.
Tale XLIV. (A). Through telling the truth, a Grey Friar receives as
alms
from the Lord of Sedan two pigs instead of one.
Tale XLIV. (B). Honourable conduct of a young citizen of Paris, who,
after suddenly enjoying his sweetheart, at last happily marries.
Tale XLV. Cleverness of an upholsterer of Touraine, who, to hide
that
he has given the Innocents to his serving-maid, contrives to give
them
afterwards to his wife.
Tale XLVI. (A). Wicked acts of a Grey Friar of Angoulême called De
Vale,
who fails in his purpose with the wife of the Judge of the Exempts,
but
to whom a mother in blind confidence foolishly abandons her
daughter.
Tale XLVI. (B). Sermons of the Grey Friar De Vallès, at first against
and afterwards on behalf of husbands that beat their wives.
Tale XLVII. The undeserved jealousy of a gentleman of Le Perche
towards
another gentleman, his friend, leads the latter to deceive him.
Tale XLVIII. Wicked act of a Grey Friar of Perigord, who, while a
husband was dancing at his wedding, went and took his place with
58. the
bride.
Tale XLIX. Story of a foreign Countess, who, not content with
having
King Charles as her lover, added to him three lords, to wit, Astillon,
Durassier and Valnebon.
Tale L. Melancholy fortune of Messire John Peter, a gentleman of
Cremona, who dies just when he is winning the affection of the
lady he
loves.
Appendix to Vol. IV.
THE TALES OF THE
HEPTAMERON OF
MARGARET, QUEEN OF
NAVARRE
IN FIVE VOLUMES
59. VOLUME THE FIFTH
CONTENTS
SIXTH DAY.
PROLOGUE.
TALE LI.
TALE LII.
TALE LIII.
TALE LIV.
TALE LV.
TALE LVI.
TALE LVII.
TALE LVIII.
TALE LIX.
TALE LX.
SEVENTH DAY.
PROLOGUE.
TALE LXI.
TALE LXII.
TALE LXIII.
TALE LXIV.
TALE LXV.
TALE LXVI.
TALE LXVII.
60. TALE LXVIII.
TALE LXIX.
TALE LXX.
EIGHTH DAY.
PROLOGUE.
TALE LXXI.
TALE LXXII.
APPENDIX.
THE SUPPOSED NARRATORS OF THE HEPTAMERON TALES.
BIBLIOGRAPHY.
ILLUSTRATIONS
Frontispiece
Titlepage
005a.jpg the Duke of Urbino Sending The Maiden to Prison for
Carrying Messages
005.jpg Page Image
014.jpg Tailpiece
015a.jpg the Gentleman and his Friend Annoyed by The Smell of
Sugar
015.jpg Page Image
022.jpg Tailpiece
023a.jpg the Lord Des Cheriots Flying from The Prince’s Servant
023.jpg Page Image
036.jpg Tailpiece
037a.jpg the Lady Watching The Shadow Faces Kissing
037.jpg Page Image
042.jpg Tailpiece
61. 043a.jpg the Servant Selling The Horse With The Cat
043.jpg Page Image
049.jpg Tailpiece
051a.jpg the Grey Friar Introducing his Comrade to The Lady and
Her Daughter
051.jpg Page Image
061.jpg Tailpiece
063a.jpg the English Lord Seizing The Lady’s Glove
063.jpg Page Image
070.jpg Tailpiece
071a. The Gentleman Mocked by The Ladies when Returning from
The False Tryst
071.jpg Page Image
078.jpg Tailpiece
079a. The Lady Discovering Her Husband With The Waiting-woman
079.jpg Page Image
090.jpg Tailpiece
091a. The Chanter of Blois Delivering his Mistress from The Grave
091.jpg Page Image
099.jpg Tailpiece
105a. The Lady Returning to Her Lover, The Canon of Autun
105.jpg Page Image
117.jpg Tailpiece
119a. The Gentleman’s Spur Catching in The Sheet
119.jpg Page Image
124.jpg Tailpiece
125a. The King Asking The Young Lord to Join his Banquet
125.jpg Page Image
132.jpg Tailpiece
133a. The Lady Swooning in The Arms of The Gentleman Of
Valencia Who Had Become a Monk
62. 133.jpg Page Image
141.jpg Tailpiece
143a. The Old Woman Startled by The Waking of The Soldier
143.jpg Page Image
147.jpg Tailpiece
149a. The Old Serving-woman Explaining Her Mistake To The Duke
and Duchess of Vendôme
149.jpg Page Image
154.jpg Tailpiece
155a. The Wife Reading to Her Husband on The Desert Island
155.jpg Page Image
161.jpg Tailpiece
163a. The Apothecary’s Wife Giving The Dose of Cantharides To
Her Husband
163.jpg Page Image
168.jpg Tailpiece
169a. The Wife Discovering Her Husband in The Hood Of Their
Serving-maid
169.jpg Page Image
174.jpg Tailpiece
175a. The Gentleman Killing Himself on The Death of his Mistress
175.jpg Page Image
213.jpg Tailpiece
219a. The Saddler’s Wife Cured by The Sight of Her Husband
Caressing the Serving-maid
219.jpg Page Image
224.jpg Tailpiece
225a. The Monk Conversing With The Nun While Shrouding A Dead
Body
225.jpg Page Image
232.jpg Tailpiece
63. DETAILED CONTENTS OF VOLUME V.
SIXTH DAY.
Prologue
Tale LI. Cruelty of the Duke of Urbino, who, contrary to the
promise he had given to the Duchess, hanged a poor lady that had
consented to convey letters to his son’s sweetheart, the sister of
the Abbot of Farse.
Tale LII. Merry trick played by the varlet of an apothecary at
Alençon on the Lord de la Tirelière and the lawyer Anthony
Bacheré, who, thinking to breakfast at his expense, find that they
have stolen from him something very different to a loaf of sugar.
Tale LIII. Story of the Lady of Neufchâtel, a widow at the Court of
Francis I., who, through not admitting that she has plighted her
troth to the Lord des Cheriots, plays him an evil trick through the
means of the Prince of Belhoste.
Tale LIV. Merry adventure of a serving-woman and a gentleman
named Thogas, whereof his wife has no suspicion.
Tale LV. The widow of a merchant of Saragossa, not wishing to lose
the value of a horse, the price of which her husband had ordered
to be given to the poor, devises the plan of selling the horse for
one ducat only, adding, however, to the bargain a cat at ninety-
nine.
Tale LVI. Notable deception practised by an old Grey Friar of
Padua, who, being charged by a widow to find a husband for her
daughter, did, for the sake of getting the dowry, cause her to marry
a young Grey Friar, his comrade, whose condition, however, was
before long discovered.
Tale LVII. Singular behaviour of an English lord, who is content
merely to keep and wear upon his doublet the glove of a lady
whom he loves.
Tale LVIII. A lady at the Court of Francis I., wishing to prove that
she has no commerce with a certain gentleman who loves her,
gives him a pretended tryst and causes him to pass for a thief.
Tale LIX. Story of the same lady, who, learning that her husband is
in love with her waiting-woman, contrives to surprise him and
impose her own terms upon him.
Tale LX. A man of Paris, thinking his wife to be well and duly
deceased, marries again, but at the end of fifteen years is forced to
take his first wife back, although she has been living meantime
with one of the chanters of Louis XII.
SEVENTH DAY.
Prologue
Tale LXI. Great kindness of a husband, who consents to take back
his wife twice over, spite of her wanton love for a Canon of Autun.
Tale LXII. How a lady, while telling a story as of another, let her
tongue trip in such a way as to show that what she related had
happened to herself.
Tale LXIII. How the honourable behaviour of a young lord, who
feigns sickness in order to be faithful to his wife, spoils a party in
which he was to have made one with the King, and in this way
saves the honour of three maidens of Paris.
Tale LXIV. Story of a gentleman of Valencia in Spain, whom a lady
drove to such despair that he became a monk, and whom
afterwards she strove in vain to win back to herself.
Tale LXV. Merry mistake of a worthy woman, who in the church of
St. John of Lyons mistakes a sleeping soldier for one of the statues
on a tomb, and sets a lighted candle on his forehead.
Tale LXVI. How an old serving-woman, thinking to surprise a
Prothonotary with a lady, finds herself insulting Anthony de
Bourbon and his wife Jane d’Albret.
Tale LXVII. How the Sire de Robertval, granting a traitor his life at
the prayers of the man’s wife, set them both down on a desert
island, and how, after the husband’s death, the wife was rescued
and brought back to La Rochelle.
Tale LXVIII. The wife of an apothecary at Pau, hearing her
husband give some powder of cantharides to a woman who was
godmother with himself, secretly administered to him such a dose
of the same drug that he nearly died.
Tale LXIX. How the wife of one of the King’s Equerries surprised
her husband muffled in the hood of their servant-maid, and bolting
meal in her stead.
Tale LXX. Of the love of a Duchess of Burgundy for a gentleman
who rejects her advances, for which reason she accuses him to the
Duke her husband, and the latter does not believe his oaths till
assured by him that he loves the Lady du Vergier. Then the
Duchess, having drawn knowledge of this amour from her
husband, addresses to the Lady du Vergier in public, an allusion
that causes the death of both lovers; and the Duke, in despair at
his own lack of discretion, stabs the Duchess himself.
EIGHTH DAY.
Prologue
Tale LXXI. The wife of a saddler of Amboise is saved on her
deathbed through a fit of anger at seeing her husband fondle a
servant-maid.
Tale LXXII. Kindness of the Duchess of Alençon to a poor nun
whom she meets at Lyons, on her way to Rome, there to confess
to the Pope how a monk had wronged her, and to obtain his
Holiness’s pardon.
THE TALES OF THE
HEPTAMERON
OF MARGARET, QUEEN OF
NAVARRE
IN FIVE VOLUMES
DETAILED CONTENTS
OF THE TALES OF ALL FIVE
VOLUMES
Tale I. The pitiful history of a Proctor of Alençon, named St.
Aignan, and of his wife, who caused her husband to assassinate her
lover, the son of the Lieutenant-General
Tale II. The fate of the wife of a muleteer of Amboise, who suffered
herself to be killed by her servant rather than sacrifice her
chastity
Tale III. The revenge taken by the Queen of Naples, wife to King
Alfonso, for her husband’s infidelity with a gentleman’s wife
Tale IV. The ill success of a Flemish gentleman who was unable to
obtain, either by persuasion or force, the love of a great Princess
Tale V. How a boatwoman of Coulon, near Nyort, contrived to escape
from the vicious designs of two Grey Friars
Tale VI. How the wife of an old valet of the Duke of Alençon’s
succeeded in saving her lover from her husband, who was blind of one
eye
Tale VII. The craft of a Parisian merchant, who saved the reputation
of the daughter by offering violence to the mother
FIRST DAY—Continued.
Tale VIII. The misadventure of Bornet, who, planning with a friend of
his that both should lie with a serving-woman, discovers too late
that they have had to do with his own wife.
Tale IX. The evil fortune of a gentleman of Dauphiné, who dies of
despair because he cannot marry a damsel nobler and richer than
himself.
Tale X. The Spanish story of Florida, who, after withstanding the
love of a gentleman named Amadour for many years, eventually becomes
a nun.
SECOND DAY.
Prologue
Tale XI. (A). Mishap of the Lady de Roncex in the Grey Friars’
Convent at Thouars.
Tale XI. (B). Facetious discourse of a Friar of Touraine.
Tale XII. Story of Alexander de’ Medici, Duke of Florence, whom his
cousin, Lorenzino de’ Medici, slew in order to save his sister’s
honour.
Tale XIII. Praiseworthy artifice of a lady to whom a sea Captain sent
a letter and diamond ring, and who, by forwarding them to the
Captain’s wife as though they had been intended for her, united
husband and wife once more in all affection.
Tale XIV. The Lord of Bonnivet, after furthering the love entertained
by an Italian gentleman for a lady of Milan, finds means to take the
other’s place and so supplant him with the lady who had formerly
rejected himself.
Tale XV. The troubles and evil fortune of a virtuous lady who, after
being long neglected by her husband, becomes the object of his
jealousy.
Tale XVI. Story of a Milanese Countess, who, after long rejecting the
love of a French gentleman, rewards him at last for his
faithfulness, but not until she has put his courage to the proof.
Tale XVII. The noble manner in which King Francis the First shows
Count William of Furstemberg that he knows of the plans laid by him
against his life, and so compels him to do justice upon himself and
to leave France.
Tale XVIII. A young gentleman scholar at last wins a lady’s love,
after enduring successfully two trials that she had made of him.
Appendix to Vol. II
SECOND DAY—Continued.
Tale XIX. The honourable love of a gentleman, who, when his
sweetheart is forbidden to speak with him, in despair becomes a monk
of the Observance, while the lady, following in his footsteps,
becomes a nun of St. Clara
Tale XX. How the Lord of Riant is cured of his love for a beautiful
widow through surprising her in the arms of a groom
THIRD DAY.
Prologue
Tale XXI. The affecting history of Rolandine, who, debarred from
marriage by her father’s greed, betrothes herself to a gentleman to
whom, despite his faithlessness, she keeps her plighted word, and
does not marry until after his death
Tale XXII. How Sister Marie Heroet virtuously escapes the attempts of
the Prior of St. Martin-in-the-Fields
Tale XXIII. The undeserved confidence which a gentleman of Perigord
places in the monks of the Order of St. Francis, causes the death of
himself, his wife and their little child
Tale XXIV. Concerning the unavailing love borne to the Queen of
Castile by a gentleman named Elisor, who in the end becomes a hermit
Tale XXV. How a young Prince found means to conceal his intrigue with
the wife of a lawyer of Paris
Tale XXVI. How the counsels of a discreet lady happily withdrew the
young Lord of Avannes from the perils of his foolish love for a lady
of Pampeluna
Tale XXVII. How the wife of a man who was valet to a Princess rid
herself of the solicitations of one who was among the same
Princess’s servants, and at the same time her husband’s guest
Tale XXVIII. How a Gascon merchant, named Bernard du Ha, while
sojourning at Paris, deceived a Secretary to the Queen of Navarre
who had thought to obtain a pasty from him
Tale XXIX. How the Priest of Carrelles, in Maine, when surprised with
the wife of an old husbandman, gets out of the difficulty by
pretending to return him a winnowing fan
Tale XXX. How a gentleman marries his own daughter and sister
unawares
FOURTH DAY.
Prologue
Tale XXXI. Punishment of the wickedness of a Friar who sought to lie
with a gentleman’s wife.
Tale XXXII. How an ambassador of Charles VIII., moved by the
repentance of a German lady, whom her husband compelled to drink out
of her lover’s skull, reconciled husband and wife together.
Tale XXXIII. The hypocrisy of a priest who, under the cloak of
sanctity, had lain with his own sister, is discovered and punished
by the wisdom of the Count of Angoulême.
Tale XXXIV. The terror of two Friars who believed that a butcher
intended to murder them, whereas the poor man was only speaking of
his pigs.
Tale XXXV. How a husband’s prudence saves his wife from the risks she
incurred while thinking to yield to merely a spiritual love.
Tale XXXVI. The story of the President of Grenoble, who saves the
honour of his house by poisoning his wife with a salad.
Tale XXXVII. How the Lady of Loué regained her husband’s affection.
Tale XXXVIII. The kindness of a townswoman of Tours to a poor
farm-woman who is mistress to her husband, makes the latter so
ashamed of his faithlessness that he returns to his wife.
Tale XXXIX. How the Lord of Grignaulx rid one of his houses of a
pretended ghost.
Tale XL. The unhappy history of the Count de Jossebelin’s sister, who
shut herself up in a hermitage because her brother caused her
husband to be slain.
FIFTH DAY.
Prologue
Tale XLI. Just punishment of a Grey Friar for the unwonted penance
that he would have laid upon a maiden.
Tale XLII. The virtuous resistance made by a young woman of Touraine
causes a young Prince that is in love with her, to change his desire
to respect, and to bestow her honourably in marriage.
Tale XLIII. How a little chalk-mark revealed the hypocrisy of a lady
called Jambicque, who was wont to hide the pleasures she indulged
in, beneath the semblance of austerity.
Tale XLIV. (A). Through telling the truth, a Grey Friar receives as
alms from the Lord of Sedan two pigs instead of one.
Tale XLIV. (B). Honourable conduct of a young citizen of Paris, who,
after suddenly enjoying his sweetheart, at last happily marries.
Tale XLV. Cleverness of an upholsterer of Touraine, who, to hide that
he has given the Innocents to his serving-maid, contrives to give
them afterwards to his wife.
Tale XLVI. (A). Wicked acts of a Grey Friar of Angoulême called De
Vale, who fails in his purpose with the wife of the Judge of the
Exempts, but to whom a mother in blind confidence foolishly abandons
her daughter.
Tale XLVI. (B). Sermons of the Grey Friar De Vallès, at first against
and afterwards on behalf of husbands that beat their wives.
Tale XLVII. The undeserved jealousy of a gentleman of Le Perche
towards another gentleman, his friend, leads the latter to deceive
him.
Tale XLVIII. Wicked act of a Grey Friar of Perigord, who, while a
husband was dancing at his wedding, went and took his place with the
bride.
Tale XLIX. Story of a foreign Countess, who, not content with having
King Charles as her lover, added to him three lords, to wit,
Astillon, Durassier and Valnebon.
Tale L. Melancholy fortune of Messire John Peter, a gentleman of
Cremona, who dies just when he is winning the affection of the lady
he loves.
Appendix to Vol. IV.
SIXTH DAY.
Prologue
Tale LI. Cruelty of the Duke of Urbino, who, contrary to the
promise he had given to the Duchess, hanged a poor lady that had
consented to convey letters to his son’s sweetheart, the sister of
the Abbot of Farse.
Tale LII. Merry trick played by the varlet of an apothecary at
Alençon on the Lord de la Tirelière and the lawyer Anthony
Bacheré, who, thinking to breakfast at his expense, find that they
have stolen from him something very different to a loaf of sugar.
Tale LIII. Story of the Lady of Neufchâtel, a widow at the Court of
Francis I., who, through not admitting that she has plighted her
troth to the Lord des Cheriots, plays him an evil trick through the
means of the Prince of Belhoste.
Tale LIV. Merry adventure of a serving-woman and a gentleman
named Thogas, whereof his wife has no suspicion.
Tale LV. The widow of a merchant of Saragossa, not wishing to lose
the value of a horse, the price of which her husband had ordered
to be given to the poor, devises the plan of selling the horse for
one ducat only, adding, however, to the bargain a cat at ninety-
nine.
Tale LVI. Notable deception practised by an old Grey Friar of
Padua, who, being charged by a widow to find a husband for her
daughter, did, for the sake of getting the dowry, cause her to marry
a young Grey Friar, his comrade, whose condition, however, was
before long discovered.
Tale LVII. Singular behaviour of an English lord, who is content
merely to keep and wear upon his doublet the glove of a lady
whom he loves.
Tale LVIII. A lady at the Court of Francis I., wishing to prove that
she has no commerce with a certain gentleman who loves her,
gives him a pretended tryst and causes him to pass for a thief.
Tale LIX. Story of the same lady, who, learning that her husband is
in love with her waiting-woman, contrives to surprise him and
impose her own terms upon him.
Tale LX. A man of Paris, thinking his wife to be well and duly
deceased, marries again, but at the end of fifteen years is forced to
take his first wife back, although she has been living meantime
with one of the chanters of Louis XII.
SEVENTH DAY.
Prologue
Tale LXI. Great kindness of a husband, who consents to take back
his wife twice over, spite of her wanton love for a Canon of Autun.
Tale LXII. How a lady, while telling a story as of another, let her
tongue trip in such a way as to show that what she related had
happened to herself.
Tale LXIII. How the honourable behaviour of a young lord, who
feigns sickness in order to be faithful to his wife, spoils a party in
which he was to have made one with the King, and in this way
saves the honour of three maidens of Paris.
Tale LXIV. Story of a gentleman of Valencia in Spain, whom a lady
drove to such despair that he became a monk, and whom
afterwards she strove in vain to win back to herself.
Tale LXV. Merry mistake of a worthy woman, who in the church of
St. John of Lyons mistakes a sleeping soldier for one of the statues
on a tomb, and sets a lighted candle on his forehead.
Tale LXVI. How an old serving-woman, thinking to surprise a
Prothonotary with a lady, finds herself insulting Anthony de
Bourbon and his wife Jane d’Albret.
Tale LXVII. How the Sire de Robertval, granting a traitor his life at
the prayers of the man’s wife, set them both down on a desert
island, and how, after the husband’s death, the wife was rescued
and brought back to La Rochelle.
Tale LXVIII. The wife of an apothecary at Pau, hearing her
husband give some powder of cantharides to a woman who was
godmother with himself, secretly administered to him such a dose
of the same drug that he nearly died.
Tale LXIX. How the wife of one of the King’s Equerries surprised
her husband muffled in the hood of their servant-maid, and bolting
meal in her stead.
Tale LXX. Of the love of a Duchess of Burgundy for a gentleman
who rejects her advances, for which reason she accuses him to the
Duke her husband, and the latter does not believe his oaths till
assured by him that he loves the Lady du Vergier. Then the
Duchess, having drawn knowledge of this amour from her
husband, addresses to the Lady du Vergier in public, an allusion
that causes the death of both lovers; and the Duke, in despair at
his own lack of discretion, stabs the Duchess himself.
EIGHTH DAY.
Prologue
Tale LXXI. The wife of a saddler of Amboise is saved on her
deathbed through a fit of anger at seeing her husband fondle a
servant-maid.
Tale LXXII. Kindness of the Duchess of Alençon to a poor nun
whom she meets at Lyons, on her way to Rome, there to confess
to the Pope how a monk had wronged her, and to obtain his
Holiness’s pardon.
*** END OF THE PROJECT GUTENBERG EBOOK INDEX OF THE
PROJECT GUTENBERG WORKS OF MARGUERITE, QUEEN OF
NAVARRE ***
Updated editions will replace the previous one—the old editions will
be renamed.
Creating the works from print editions not protected by U.S.
copyright law means that no one owns a United States copyright in
these works, so the Foundation (and you!) can copy and distribute it
in the United States without permission and without paying
copyright royalties. Special rules, set forth in the General Terms of
Use part of this license, apply to copying and distributing Project
Gutenberg™ electronic works to protect the PROJECT GUTENBERG™
concept and trademark. Project Gutenberg is a registered trademark,
and may not be used if you charge for an eBook, except by following
the terms of the trademark license, including paying royalties for use
of the Project Gutenberg trademark. If you do not charge anything
for copies of this eBook, complying with the trademark license is
very easy. You may use this eBook for nearly any purpose such as
creation of derivative works, reports, performances and research.
Project Gutenberg eBooks may be modified and printed and given
away—you may do practically ANYTHING in the United States with
eBooks not protected by U.S. copyright law. Redistribution is subject
to the trademark license, especially commercial redistribution.
START: FULL LICENSE
PLEASE READ THIS BEFORE YOU DISTRIBUTE OR USE THIS WORK
To protect the Project Gutenberg™ mission of promoting the free
distribution of electronic works, by using or distributing this work (or
any other work associated in any way with the phrase “Project
Gutenberg”), you agree to comply with all the terms of the Full
Project Gutenberg™ License available with this file or online at
www.gutenberg.org/license.
Section 1. General Terms of Use and
Redistributing Project Gutenberg™
electronic works
1.A. By reading or using any part of this Project Gutenberg™
electronic work, you indicate that you have read, understand, agree
to and accept all the terms of this license and intellectual property
(trademark/copyright) agreement. If you do not agree to abide by all
the terms of this agreement, you must cease using and return or
destroy all copies of Project Gutenberg™ electronic works in your
possession. If you paid a fee for obtaining a copy of or access to a
Project Gutenberg™ electronic work and you do not agree to be
bound by the terms of this agreement, you may obtain a refund
from the person or entity to whom you paid the fee as set forth in
paragraph 1.E.8.
1.B. “Project Gutenberg” is a registered trademark. It may only be
used on or associated in any way with an electronic work by people
who agree to be bound by the terms of this agreement. There are a
few things that you can do with most Project Gutenberg™ electronic
works even without complying with the full terms of this agreement.
See paragraph 1.C below. There are a lot of things you can do with
Project Gutenberg™ electronic works if you follow the terms of this
agreement and help preserve free future access to Project
Gutenberg™ electronic works. See paragraph 1.E below.
1.C. The Project Gutenberg Literary Archive Foundation (“the
Foundation” or PGLAF), owns a compilation copyright in the
collection of Project Gutenberg™ electronic works. Nearly all the
individual works in the collection are in the public domain in the
United States. If an individual work is unprotected by copyright law
in the United States and you are located in the United States, we do
not claim a right to prevent you from copying, distributing,
performing, displaying or creating derivative works based on the
work as long as all references to Project Gutenberg are removed. Of
course, we hope that you will support the Project Gutenberg™
mission of promoting free access to electronic works by freely
sharing Project Gutenberg™ works in compliance with the terms of
this agreement for keeping the Project Gutenberg™ name associated
with the work. You can easily comply with the terms of this
agreement by keeping this work in the same format with its attached
full Project Gutenberg™ License when you share it without charge
with others.
1.D. The copyright laws of the place where you are located also
govern what you can do with this work. Copyright laws in most
countries are in a constant state of change. If you are outside the
United States, check the laws of your country in addition to the
terms of this agreement before downloading, copying, displaying,
performing, distributing or creating derivative works based on this
work or any other Project Gutenberg™ work. The Foundation makes
no representations concerning the copyright status of any work in
any country other than the United States.
1.E. Unless you have removed all references to Project Gutenberg:
1.E.1. The following sentence, with active links to, or other
immediate access to, the full Project Gutenberg™ License must
appear prominently whenever any copy of a Project Gutenberg™
work (any work on which the phrase “Project Gutenberg” appears,
or with which the phrase “Project Gutenberg” is associated) is
accessed, displayed, performed, viewed, copied or distributed: