This script accompanies the paper “fcaR, Formal Concept Analysis with R” that presents the fcaR package.
The script can be sourced or run line by line, or one can use “knitr::spin('./fcaR.R')” to run everything and obtain an HTML file with all the results.
library(magrittr)
if (!require(fcaR)) {
# Choose the version to install as needed.
# Last version submitted to CRAN
# install.packages("fcaR")
# Development version
remotes::install_github("Malaga-FCA-group/fcaR",
                        dependencies = TRUE)
}
In case the package “hasseDiagram” has not been installed (it is needed to plot concept lattices), we can install it explicitly (possibly after enabling the Bioconductor repositories with setRepositories()):
# install.packages("hasseDiagram")
Then, we can load the library:
library(fcaR)
The following code defines the formal context that is used as the running example in the sections “Background on FCA” and “Formal concept analysis with fcaR”:
objects <- paste0("O", 1:4)
n_objects <- length(objects)
attributes <- paste0("P", 1:4)
n_attributes <- length(attributes)
I <- matrix(data = c(0, 0.5, 0.5, 0,
                     0.5, 1, 1, 0.5,
                     0.5, 0, 0, 1,
                     0.5, 1, 1, 0.5),
            nrow = n_objects,
            byrow = FALSE)
colnames(I) <- attributes
rownames(I) <- objects
fc <- FormalContext$new(I)
The formal context can be printed with
fc
## FormalContext with 4 objects and 4 attributes.
## P1 P2 P3 P4
## O1 0 0.5 0.5 0.5
## O2 0.5 1 0 1
## O3 0.5 1 0 1
## O4 0 0.5 1 0.5
In the paper, it is exported to LaTeX using:
fc$to_latex(fraction = "sfrac")
## \begin{table} \centering \begin{tabular}{lcccc}
## \toprule
## & P1 & P2 & P3 & P4\\
## \midrule
## O1 & 0 & \sfrac{1}{2} & \sfrac{1}{2} & \sfrac{1}{2}\\
## O2 & \sfrac{1}{2} & 1 & 0 & 1\\
## O3 & \sfrac{1}{2} & 1 & 0 & 1\\
## O4 & 0 & \sfrac{1}{2} & 1 & \sfrac{1}{2}\\
## \bottomrule
## \end{tabular} \caption{\label{}} \end{table}
The examples in this section were calculated using fcaR. For instance, the sets S and T are defined as follows:
S <- Set$new(fc$objects, O1 = 1, O2 = 1)
T <- Set$new(fc$attributes, P1 = 0.5)
The intent of S, the extent of T, and the closure of T (in the paper, the closure operator was named “phi”) are calculated with:
intentS <- fc$intent(S)
intentS
## {P2 [0.5], P4 [0.5]}
extentT <- fc$extent(T)
extentT
## {O2, O3}
phiT <- fc$closure(T)
phiT
## {P1 [0.5], P2, P4}
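As a sanity check (this is our reading of the standard fuzzy derivation, not something stated in the script): for a crisp set of objects, the intent is the attribute-wise minimum of the corresponding rows of I. We can verify the intent of S with base R, copying the rows from the context printed above:

```r
# Rows of the context I for O1 and O2, copied from the printout of fc
rowO1 <- c(P1 = 0, P2 = 0.5, P3 = 0.5, P4 = 0.5)
rowO2 <- c(P1 = 0.5, P2 = 1, P3 = 0, P4 = 1)
# Attribute-wise minimum: the degrees shared by all objects of S = {O1, O2}
pmin(rowO1, rowO2)
##  P1  P2  P3  P4
## 0.0 0.5 0.0 0.5
```

The nonzero degrees, {P2 [0.5], P4 [0.5]}, coincide with the result of fc$intent(S).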
The example of the concept that appears in that section requires computing the concept lattice first. We find the concept lattice with:
fc$find_concepts()
Then, we take the 6th concept (for example)
C <- fc$concepts$sub(6)
If we denote by (A, B) the left- and right-hand sides of C, we can check that A = extent(B) and B = intent(A). To extract these components of a Concept, we use the “get_extent()” and “get_intent()” methods.
A <- C$get_extent()
A
## {O2, O3}
B <- C$get_intent()
B
## {P1 [0.5], P2, P4}
fc$extent(B) # This should match A
## {O2, O3}
fc$intent(A) # This should match B
## {P1 [0.5], P2, P4}
Another way to check this is by using the %==% operator:
A %==% fc$extent(B)
## [1] TRUE
B %==% fc$intent(A)
## [1] TRUE
In the paper, we make extensive use of exporting variables to LaTeX, just by using the “to_latex()” method:
A$to_latex()
## \ensuremath{\left\{\mathrm{O2},\, \mathrm{O3}\right\}}
The example of the order relationship between concepts uses the second subconcept of the concept C previously computed:
Csub <- fc$concepts$subconcepts(6)$sub(2)
We can check that Csub is a subconcept of C using the operator %<=%:
Csub %<=% C
## [1] TRUE
To plot the concept lattice (Figure 1):
fc$concepts$plot()
We compute the Duquenne-Guigues basis of implications using:
fc$find_implications()
These implications form a basis of the valid implications:
fc$implications
## Implication set with 6 implications.
## Rule 1: {} -> {P2 [0.5], P4 [0.5]}
## Rule 2: {P2 [0.5], P4} -> {P2}
## Rule 3: {P2, P4 [0.5]} -> {P4}
## Rule 4: {P2, P3 [0.5], P4} -> {P3}
## Rule 5: {P1 [0.5], P2 [0.5], P4 [0.5]} -> {P2, P4}
## Rule 6: {P1 [0.5], P2, P3, P4} -> {P1}
Again, they are exported to LaTeX with:
fc$implications$to_latex()
## \begin{longtable*}{rrcl}
## 1: &\ensuremath{\varnothing}&\ensuremath{\Rightarrow}&\ensuremath{\left\{{^{0.5}}\!/\mathrm{P2},\, {^{0.5}}\!/\mathrm{P4}\right\}}\\
## 2: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P2},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2}\right\}}\\
## 3: &\ensuremath{\left\{\mathrm{P2},\, {^{0.5}}\!/\mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P4}\right\}}\\
## 4: &\ensuremath{\left\{\mathrm{P2},\, {^{0.5}}\!/\mathrm{P3},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P3}\right\}}\\
## 5: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1},\, {^{0.5}}\!/\mathrm{P2},\, {^{0.5}}\!/\mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2},\, \mathrm{P4}\right\}}\\
## 6: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1},\, \mathrm{P2},\, \mathrm{P3},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P1}\right\}}\\
## \end{longtable*}
To apply the equivalence rules mentioned in the section, it is advisable to clone the set of implications beforehand, so as not to overwrite it:
imps <- fc$implications$clone()
# Here we apply Simplification and RightSimplification
imps$apply_rules(c("simp", "rsimp"))
The differences from the previous implications are in rules 2 to 6:
imps[2:6]
## Implication set with 5 implications.
## Rule 1: {P4} -> {P2}
## Rule 2: {P2} -> {P4}
## Rule 3: {P3 [0.5], P4} -> {P3}
## Rule 4: {P1 [0.5]} -> {P4}
## Rule 5: {P1 [0.5], P3} -> {P1}
The actual code used in the paper to match the ids of the implications is:
# indexes of rules that have changed
id_diff <- which(base::rowSums(abs(fc$implications$size() - imps$size())) > 0)
# Export to LaTeX only those rules with the given numbers
imps[id_diff]$to_latex(numbered = TRUE, numbers = id_diff)
## \begin{longtable*}{rrcl}
## 2: &\ensuremath{\left\{\mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2}\right\}}\\
## 3: &\ensuremath{\left\{\mathrm{P2}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P4}\right\}}\\
## 4: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P3},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P3}\right\}}\\
## 5: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P4}\right\}}\\
## 6: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1},\, \mathrm{P3}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P1}\right\}}\\
## \end{longtable*}
The example of computing the closure of a set S uses:
# Definition of S
S <- Set$new(fc$attributes, P2 = 0.5)
# Closure computation
Scl <- fc$implications$closure(S, reduce = TRUE)
The closure of S, named S+ in the paper, is
Scl$closure
## {P2 [0.5], P4 [0.5]}
The reduced set of implications is
Scl$implications
## Implication set with 5 implications.
## Rule 1: {P4} -> {P2}
## Rule 2: {P2} -> {P4}
## Rule 3: {P2, P3 [0.5], P4} -> {P3}
## Rule 4: {P1 [0.5]} -> {P2, P4}
## Rule 5: {P1 [0.5], P2, P3, P4} -> {P1}
The implications, as before, are exported to LaTeX with
Scl$implications$to_latex()
## \begin{longtable*}{rrcl}
## 1: &\ensuremath{\left\{\mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2}\right\}}\\
## 2: &\ensuremath{\left\{\mathrm{P2}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P4}\right\}}\\
## 3: &\ensuremath{\left\{\mathrm{P2},\, {^{0.5}}\!/\mathrm{P3},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P3}\right\}}\\
## 4: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2},\, \mathrm{P4}\right\}}\\
## 5: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P1},\, \mathrm{P2},\, \mathrm{P3},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P1}\right\}}\\
## \end{longtable*}
To plot the heatmap of a formal context, we use (Figure 2 in the paper):
fc$plot()
In this section, we begin with examples similar to those of the “Background on FCA” section.
S <- Set$new(fc$objects, O1 = 1, O2 = 1)
S
## {O1, O2}
fc$intent(S)
## {P2 [0.5], P4 [0.5]}
T <- Set$new(fc$attributes, P1 = 1, P3 = 1)
T
## {P1, P3}
fc$extent(T)
## {}
fc$closure(T)
## {P1, P2, P3, P4}
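The empty extent can be double-checked directly on the incidence matrix (a base-R sketch, with the rows copied from the context printed earlier): no object has both P1 and P3 to degree 1, so no object belongs to the extent of T, and the closure degenerates to the whole attribute set.

```r
# Context rows, copied from the printout of fc at the beginning of the script
I <- rbind(O1 = c(0, 0.5, 0.5, 0.5),
           O2 = c(0.5, 1, 0, 1),
           O3 = c(0.5, 1, 0, 1),
           O4 = c(0, 0.5, 1, 0.5))
colnames(I) <- paste0("P", 1:4)
# Is there any object with both P1 and P3 to degree 1?
any(I[, "P1"] >= 1 & I[, "P3"] >= 1)
## [1] FALSE
```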
In addition, we show how to perform clarification on the formal context fc
fc_cla <- fc$clarify(TRUE)
fc_cla
## FormalContext with 3 objects and 3 attributes.
## P1 P3 [P2, P4]
## O1 0 0.5 0.5
## O4 0 1 0.5
## [O2, O3] 0.5 0 1
Again, to compute the concept lattice
fc$find_concepts()
And to show the list of concepts
fc$concepts
## A set of 8 concepts:
## 1: ({O1, O2, O3, O4}, {P2 [0.5], P4 [0.5]})
## 2: ({O1, O4}, {P2 [0.5], P3 [0.5], P4 [0.5]})
## 3: ({O1 [0.5], O4}, {P2 [0.5], P3, P4 [0.5]})
## 4: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 5: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 6: ({O2, O3}, {P1 [0.5], P2, P4})
## 7: ({O2 [0.5], O3 [0.5]}, {P1, P2, P4})
## 8: ({}, {P1, P2, P3, P4})
The size of the concept lattice, that is, the number of concepts:
fc$concepts$size()
## [1] 8
One can take just a subset of the concepts using the standard R notation
fc$concepts[c(1:3, 5, 8)]
## A set of 5 concepts:
## 1: ({O1, O2, O3, O4}, {P2 [0.5], P4 [0.5]})
## 2: ({O1, O4}, {P2 [0.5], P3 [0.5], P4 [0.5]})
## 3: ({O1 [0.5], O4}, {P2 [0.5], P3, P4 [0.5]})
## 4: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 5: ({}, {P1, P2, P3, P4})
One can also use negative indexes or a boolean vector to subset:
fc$concepts[-c(1:3)]
## A set of 5 concepts:
## 1: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 2: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 3: ({O2, O3}, {P1 [0.5], P2, P4})
## 4: ({O2 [0.5], O3 [0.5]}, {P1, P2, P4})
## 5: ({}, {P1, P2, P3, P4})
To compute the support of each concept, we use the “support()” method:
fc$concepts$support()
## [1] 1.00 0.50 0.25 0.50 0.00 0.50 0.00 0.00
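As a hedged cross-check (the exact definition used by fcaR is our assumption here), the support of a concept can be read as the fraction of objects whose rows dominate the concept's intent, degree by degree. For the sixth concept, whose intent is {P1 [0.5], P2, P4}:

```r
# Context rows, copied from the printout of fc
I <- rbind(O1 = c(0, 0.5, 0.5, 0.5),
           O2 = c(0.5, 1, 0, 1),
           O3 = c(0.5, 1, 0, 1),
           O4 = c(0, 0.5, 1, 0.5))
colnames(I) <- paste0("P", 1:4)
intent6 <- c(P1 = 0.5, P2 = 1, P3 = 0, P4 = 1)  # intent of the 6th concept
# Fraction of objects dominating the intent: only O2 and O3 qualify
mean(apply(I, 1, function(row) all(row >= intent6)))
## [1] 0.5
```

This matches the sixth value (0.50) printed above.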
To build a sublattice, we can use the indexes of the “generating” concepts:
fc$concepts$sublattice(5:6)
## A set of 4 concepts:
## 1: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 2: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 3: ({O2, O3}, {P1 [0.5], P2, P4})
## 4: ({}, {P1, P2, P3, P4})
Or we could use a ConceptSet to generate the sublattice:
fc$concepts$sublattice(fc$concepts[5:6])
## A set of 4 concepts:
## 1: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 2: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 3: ({O2, O3}, {P1 [0.5], P2, P4})
## 4: ({}, {P1, P2, P3, P4})
The concept named L in the paper is obtained with:
L <- fc$concepts$sub(4)
The list of subconcepts and superconcepts of L
fc$concepts$subconcepts(L)
## A set of 5 concepts:
## 1: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 2: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 3: ({O2, O3}, {P1 [0.5], P2, P4})
## 4: ({O2 [0.5], O3 [0.5]}, {P1, P2, P4})
## 5: ({}, {P1, P2, P3, P4})
fc$concepts$superconcepts(L)
## A set of 2 concepts:
## 1: ({O1, O2, O3, O4}, {P2 [0.5], P4 [0.5]})
## 2: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
To compute the supremum and infimum of a set of concepts, the methods are “supremum()” and “infimum()”. Let us build a ConceptSet for an example:
X <- fc$concepts[c(2, 3, 4)]
X
## A set of 3 concepts:
## 1: ({O1, O4}, {P2 [0.5], P3 [0.5], P4 [0.5]})
## 2: ({O1 [0.5], O4}, {P2 [0.5], P3, P4 [0.5]})
## 3: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
fc$concepts$infimum(X) # the largest common subconcept of all elements in X
## ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
fc$concepts$supremum(X) # the smallest common superconcept of all elements in X
## ({O1, O2, O3, O4}, {P2 [0.5], P4 [0.5]})
The irreducible elements are an essential part of lattice theory. We can compute the meet- and join-irreducible elements of the lattice as follows:
fc$concepts$meet_irreducibles()
## A set of 5 concepts:
## 1: ({O1, O4}, {P2 [0.5], P3 [0.5], P4 [0.5]})
## 2: ({O1 [0.5], O4}, {P2 [0.5], P3, P4 [0.5]})
## 3: ({O1 [0.5], O2, O3, O4 [0.5]}, {P2, P4})
## 4: ({O2, O3}, {P1 [0.5], P2, P4})
## 5: ({O2 [0.5], O3 [0.5]}, {P1, P2, P4})
fc$concepts$join_irreducibles()
## A set of 5 concepts:
## 1: ({O1, O4}, {P2 [0.5], P3 [0.5], P4 [0.5]})
## 2: ({O1 [0.5], O4}, {P2 [0.5], P3, P4 [0.5]})
## 3: ({O1 [0.5], O4 [0.5]}, {P2, P3, P4})
## 4: ({O2, O3}, {P1 [0.5], P2, P4})
## 5: ({O2 [0.5], O3 [0.5]}, {P1, P2, P4})
The standard context condenses the same knowledge as the original formal context. It is built from the join- and meet-irreducible elements of the original context, together with the order relationship <= of the concept lattice.
To compute the standard context, we use the “standardize()” method.
fc_std <- fc$standardize()
fc_std
## FormalContext with 5 objects and 5 attributes.
## M1 M2 M3 M4 M5
## J1 X
## J2 X X
## J3 X X X
## J4 X X
## J5 X X X
In the paper, the table of the standard context was generated with:
fc_std$to_latex(table = FALSE)
## \begin{tabular}{lccccc}
## \toprule
## & M1 & M2 & M3 & M4 & M5\\
## \midrule
## J1 & $\times$ & & & & \\
## J2 & $\times$ & $\times$ & & & \\
## J3 & $\times$ & $\times$ & $\times$ & & \\
## J4 & & & $\times$ & $\times$ & \\
## J5 & & & $\times$ & $\times$ & $\times$\\
## \bottomrule
## \end{tabular}
Its concept lattice, which is isomorphic to that of the original formal context, is computed and plotted with:
fc_std$find_concepts()
fc_std$concepts$plot(object_names = TRUE)
To compute the Duquenne-Guigues basis of implications for the formal context fc, we use
fc$find_implications()
The complete basis can be printed with
fc$implications
## Implication set with 6 implications.
## Rule 1: {} -> {P2 [0.5], P4 [0.5]}
## Rule 2: {P2 [0.5], P4} -> {P2}
## Rule 3: {P2, P4 [0.5]} -> {P4}
## Rule 4: {P2, P3 [0.5], P4} -> {P3}
## Rule 5: {P1 [0.5], P2 [0.5], P4 [0.5]} -> {P2, P4}
## Rule 6: {P1 [0.5], P2, P3, P4} -> {P1}
As with the concepts, we can use standard R subsetting to extract only selected implications:
fc$implications[1:3]
## Implication set with 3 implications.
## Rule 1: {} -> {P2 [0.5], P4 [0.5]}
## Rule 2: {P2 [0.5], P4} -> {P2}
## Rule 3: {P2, P4 [0.5]} -> {P4}
To export them to LaTeX, we use:
fc$implications[1:3]$to_latex()
## \begin{longtable*}{rrcl}
## 1: &\ensuremath{\varnothing}&\ensuremath{\Rightarrow}&\ensuremath{\left\{{^{0.5}}\!/\mathrm{P2},\, {^{0.5}}\!/\mathrm{P4}\right\}}\\
## 2: &\ensuremath{\left\{{^{0.5}}\!/\mathrm{P2},\, \mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P2}\right\}}\\
## 3: &\ensuremath{\left\{\mathrm{P2},\, {^{0.5}}\!/\mathrm{P4}\right\}}&\ensuremath{\Rightarrow}&\ensuremath{\left\{\mathrm{P4}\right\}}\\
## \end{longtable*}
Filtering of implications: we can filter those implications that fulfill certain conditions:
fc$implications$filter(lhs = c("P1", "P2"),
rhs = "P4")
## Implication set with 2 implications.
## Rule 1: {P2, P4 [0.5]} -> {P4}
## Rule 2: {P1 [0.5], P2 [0.5], P4 [0.5]} -> {P2, P4}
The cardinality of the implication set is the number of implications:
fc$implications$cardinality()
## [1] 6
The size of an implication is the cardinality (as in set theory) of its LHS and RHS. The “size()” method can be applied to sets of implications, returning the detailed sizes of each one:
# For example, take 2 implications and compute their sizes:
my_imps <- fc$implications[4:5]
my_imps
## Implication set with 2 implications.
## Rule 1: {P2, P3 [0.5], P4} -> {P3}
## Rule 2: {P1 [0.5], P2 [0.5], P4 [0.5]} -> {P2, P4}
my_imps$size()
## LHS RHS
## [1,] 2.5 1
## [2,] 1.5 2
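These values are consistent with reading the fuzzy cardinality as the sum of membership degrees (our interpretation of the output above, not a definition stated in the script):

```r
# First implication above: {P2, P3 [0.5], P4} -> {P3}
sum(c(P2 = 1, P3 = 0.5, P4 = 1))      # LHS size
## [1] 2.5
sum(c(P3 = 1))                        # RHS size
## [1] 1
# Second implication: {P1 [0.5], P2 [0.5], P4 [0.5]} -> {P2, P4}
sum(c(P1 = 0.5, P2 = 0.5, P4 = 0.5))  # LHS size
## [1] 1.5
```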
The supports of a set of implications can be computed as:
fc$implications$support()
## [1] 1.0 0.5 0.5 0.0 0.5 0.0
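A common definition (and our assumption about what fcaR computes here) is that the support of an implication is the fraction of objects that satisfy both its LHS and its RHS. A base-R check for Rule 2, {P2 [0.5], P4} -> {P2}:

```r
# Context rows, copied from the printout of fc
I <- rbind(O1 = c(0, 0.5, 0.5, 0.5),
           O2 = c(0.5, 1, 0, 1),
           O3 = c(0.5, 1, 0, 1),
           O4 = c(0, 0.5, 1, 0.5))
colnames(I) <- paste0("P", 1:4)
# Objects satisfying both sides of Rule 2: P2 and P4 to degree 1
mean(I[, "P2"] >= 1 & I[, "P4"] >= 1)
## [1] 0.5
```

This agrees with the second value (0.5) printed above.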
The equivalence rules implemented are stored in the “equivalencesRegistry” object:
equivalencesRegistry
## An object of class 'registry' with 6 entries.
The user can obtain the names of the implemented rules with:
equivalencesRegistry$get_entry_names()
## [1] "Composition" "Generalization"
## [3] "Reduction" "Simplification"
## [5] "Right Simplification" "Reorder"
The information about one of those entries is retrieved with:
equivalencesRegistry$get_entry("Composition")
## method Composition
## fun <<function>>
## description A -> B and A -> C equivalent to
## A -> BC
The application of the equivalence rules to the implication basis is done with:
fc$implications$apply_rules(
  rules = c("reduction",
            "comp",
            "gener",
            "simpl"))
# Let us inspect the result
fc$implications
## Implication set with 6 implications.
## Rule 1: {} -> {P2 [0.5], P4 [0.5]}
## Rule 2: {P4} -> {P2}
## Rule 3: {P2} -> {P4}
## Rule 4: {P3 [0.5], P4} -> {P3}
## Rule 5: {P1 [0.5]} -> {P2, P4}
## Rule 6: {P1 [0.5], P3} -> {P1}
The example of computing the closure requires defining the Set A:
A <- Set$new(attributes = fc$attributes,
P2 = 1)
The actual computation occurs in the next line, which returns both the closure and the reduced implication set.
fc$implications$closure(A, reduce = TRUE)
## $closure
## {P2, P4}
##
## $implications
## Implication set with 2 implications.
## Rule 1: {P3 [0.5]} -> {P3}
## Rule 2: {P1 [0.5], P3} -> {P1}
Let us load the cobre32 dataset. One can get help about the dataset with “?cobre32”.
fc <- FormalContext$new(cobre32)
This formal context may be plotted with (resulting in Figure 3 in the paper):
fc$plot()
Let us compute the implication basis. This is time-consuming: depending on the hardware, it may take from 30 seconds to several minutes.
fc$find_implications()
Number of concepts and implications:
fc$concepts$size()
## [1] 14706
fc$implications$cardinality()
## [1] 985
We are going to check how the sizes of the implications decrease when using the simplification logic.
# Sizes before applying the logic
pre_logic <- colMeans(fc$implications$size())
pre_logic
## LHS RHS
## 2.417597 1.954146
We apply the logic
fc$implications$apply_rules(
  rules = c("simplification",
            "rsimplification"))
# Sizes after applying the logic:
post_logic <- colMeans(fc$implications$size())
post_logic
## LHS RHS
## 1.998308 1.557191
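From the values printed above, the simplification logic shrinks the mean LHS size by about 17% and the mean RHS size by about 20% (a quick base-R computation on the printed figures):

```r
# Mean sizes before and after applying the logic, copied from the printouts
pre  <- c(LHS = 2.417597, RHS = 1.954146)
post <- c(LHS = 1.998308, RHS = 1.557191)
# Relative reduction, in percent
round(100 * (pre - post) / pre, 1)
##  LHS  RHS
## 17.3 20.3
```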
This is the function used in the paper for the diagnosis system:
diagnose <- function(S) {
  fc$implications$recommend(S = S,
                            attribute_filter = c("dx_ss", "dx_other"))
}
These are the same examples as in the paper.
First, we define a Set and then execute the “diagnose()” function on it.
S1 <- Set$new(attributes = fc$attributes,
COSAS_1 = 1/2, COSAS_2 = 1, COSAS_3 = 1/2,
COSAS_4 = 1/6, COSAS_5 = 1/2, COSAS_6 = 1)
diagnose(S1)
## dx_ss dx_other
## 1 0
S2 <- Set$new(attributes = fc$attributes,
COSAS_2 = 1, COSAS_6 = 1, FICAL_1 = 1/3, FICAL_3 = 1/3)
diagnose(S2)
## dx_ss dx_other
## 0 0
S3 <- Set$new(attributes = fc$attributes,
COSAS_4 = 2/3, FICAL_3 = 1/2, FICAL_5 = 1/2, FICAL_8 = 1/2)
diagnose(S3)
## dx_ss dx_other
## 0 1
Since no diagnosis was obtained for S2, we can use the closure with simplification logic to obtain the reduced set of implications:
cl <- fc$implications$closure(S2, reduce = TRUE)
# We apply the equivalence rules again to obtain an even more compact implication set.
cl$implications$apply_rules(
rules = c("simp", "rsimp", "reorder"))
# And, finally, we keep the implications that have a diagnosis in the RHS but not in the LHS:
cl$implications$filter(rhs = c("dx_ss", "dx_other"),
not_lhs = c("dx_ss", "dx_other"),
drop = TRUE)
## Implication set with 12 implications.
## Rule 1: {FICAL_5 [0.33]} -> {dx_other}
## Rule 2: {FICAL_6 [0.33], FICAL_8 [0.33]} ->
## {dx_ss}
## Rule 3: {SCIDII_18 [0.33]} -> {dx_ss}
## Rule 4: {COSAS_1 [0.5], FICAL_8 [0.33]} -> {dx_ss}
## Rule 5: {SCIDII_20 [0.33]} -> {dx_ss}
## Rule 6: {SCIDII_16 [0.33]} -> {dx_ss}
## Rule 7: {SCIDII_12 [0.33]} -> {dx_ss}
## Rule 8: {FICAL_7 [0.33]} -> {dx_ss}
## Rule 9: {FICAL_6 [0.33], SCIDII_10 [0.5]} ->
## {dx_ss}
## Rule 10: {COSAS_3 [0.5]} -> {dx_ss}
## Rule 11: {COSAS_1 [0.5]} -> {dx_ss}
## Rule 12: {SCIDII_10 [0.5]} -> {dx_ss}