Car NOT Goat
Presenter in a Bind
The presenter then has to open another door in the for loop starting at line 28, but must not reveal the main prize. The object stores this door's index in the revealed attribute, and the index of the remaining third door is then saved in the alternate attribute. The for loop starting in line 50 iterates over 1,000 game shows, and the print() statement on line 55 outputs their results in CSV format: one line per show, listing the indices of the candidate's door, the presenter's door, the remaining door, and the winning door.
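As a rough sketch (not the original Listing 1, whose line numbers are quoted above), a comparable simulation that writes these four columns per show to shows.csv might look like this:

#!/usr/bin/env python3
# Rough sketch of a Monty Hall show simulator (not the article's Listing 1).
# Writes one CSV line per show: candidate door, presenter door,
# remaining (alternate) door, winning door.
import random

class Show:
    def __init__(self):
        self.winner = random.randint(0, 2)   # door hiding the main prize
        self.picked = random.randint(0, 2)   # candidate's first choice
        # The presenter opens a door that is neither the candidate's pick
        # nor the door hiding the main prize.
        self.revealed = random.choice(
            [d for d in (0, 1, 2) if d not in (self.picked, self.winner)])
        # The remaining third door the candidate could switch to.
        self.alternate = next(d for d in (0, 1, 2)
                              if d not in (self.picked, self.revealed))

with open("shows.csv", "w") as out:
    for _ in range(1000):
        s = Show()
        print(f"{s.picked},{s.revealed},{s.alternate},{s.winner}", file=out)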
One-Hot Encoding
If the AI apprentice employs a neural network and feeds in individual shows as 3-tuples, each paired with a one-part result tuple during the training phase, it won't produce satisfying results, because door indices aren't really relevant as numerical values; instead, they stand for categories, with each door representing a different category. Before the training run, the AI expert transforms such datasets into categories using one-hot encoding. If a dataset provides values for n categories, the one-hot encoder shapes individual records as n-tuples, each of which has one element set to 1 and the remaining elements set to 0.
Figure 3 shows an example of how an input series like [2,1,2,0,1,0] is converted into six one-hot-encoded matrix rows. The code in Listing 2 uses the to_categorical() function from the np_utils module of the keras.utils package to accomplish this. To get from the one-hot encoding back to the original value later, use the argmax() method provided by numpy arrays.
Listing 2: onehot
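As a rough sketch of the same conversion (not the original Listing 2; note that newer Keras versions export to_categorical() directly from keras.utils instead of np_utils):

# Sketch: one-hot encoding door indices and converting back with argmax().
# Assumes a recent Keras; older versions import to_categorical() from
# keras.utils.np_utils instead.
import numpy as np
from keras.utils import to_categorical

doors = np.array([2, 1, 2, 0, 1, 0])
onehot = to_categorical(doors, num_classes=3)
print(onehot)
# [[0. 0. 1.]
#  [0. 1. 0.]
#  [0. 0. 1.]
#  [1. 0. 0.]
#  [0. 1. 0.]
#  [1. 0. 0.]]
print(onehot.argmax(axis=1))   # back to [2 1 2 0 1 0]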
Machine Learning
Armed with the input values in one-hot format, the three-layer neural network defined in Listing 3 can now be fed with learning data. Important: The network also encodes the output values using the one-hot method and therefore needs not just a single neuron at its output but three of them, because both the training values and, later on, the predicted values are available as 3-tuples, each indicating the winning door as a 1 in a sea of zeros.
Listing 3: learn
The saved training data generated by Listing 1 in shows.csv is then read by Listing 3 in line 7. The first three elements of each line are the network's input data (candidate door, presenter door, alternative door), and the last item indicates the index of the door hiding the main prize.
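A minimal sketch of this split, assuming shows.csv holds four comma-separated integers per line (again not the original Listing 3):

# Sketch: load shows.csv and split features from the target column
# (assumes four comma-separated integer columns per line, as generated above).
import numpy as np

data = np.loadtxt("shows.csv", delimiter=",", dtype=int)
X = data[:, 0:3]   # candidate door, presenter door, alternative door
y = data[:, 3]     # index of the door hiding the main prize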
Line 12 transforms the desired output values into categories with one-hot encoding; lines 14 to 18 build the neural network with an input layer, a hidden layer, and an output layer. All layers are of the Dense type; that is, they are wired up in a brain-like fashion, with each neuron connecting to all elements of the adjacent layers. The Sequential class of the Keras package holds the layers together. Line 20 compiles the neural network model; Listing 3 specifies binary_crossentropy as the error function and selects the adam algorithm as the optimizer, which specializes in categorization problems.
In the three-layer model, 10 neurons receive the input data in the input layer, and input_dim=3 sets the data width to 3, since the input consists of 3-tuples (values for three doors). The middle layer has three neurons, and the output layer also has three; the latter is, as mentioned above, the one-hot encoding of the results as categories.