SemanticWebImport plugin update #300

Open: wants to merge 64 commits into master

Changes shown below are from 1 commit (of 64).

Commits
e939bd3
Updating plugin for gephi 0.9
ErwanDemairy Jan 6, 2016
a1f547b
Creates the columns in the table and fills them.
ErwanDemairy Jan 7, 2016
543842c
Merge branch 'master' of github.com:gephi/gephi-plugins into semantic…
ErwanDemairy Jan 8, 2016
3c9c08b
Removed misplaced files.
ErwanDemairy Jan 12, 2016
1bd086f
Merge branch 'master' of github.com:gephi/gephi-plugins
ErwanDemairy Jan 12, 2016
ba31668
Merge branch 'semantic-web-i'
ErwanDemairy Jan 12, 2016
3dd6b21
Merge branch 'master' of github.com:gephi/gephi-plugins
ErwanDemairy Feb 9, 2016
468e555
Relaxing the tests on the path so that the plugin can run.
ErwanDemairy Feb 9, 2016
09bec61
Return of the SOAP endpoint driver.
ErwanDemairy Feb 9, 2016
61686d2
Turned a constructor public to remove an exception when creating the …
ErwanDemairy Feb 9, 2016
259fe0f
Removed the SemanticTweet crawler, since the service does not exist a…
ErwanDemairy Feb 9, 2016
af01cec
Catch the exception if the URL is not correct, and display a dialog.
ErwanDemairy Feb 9, 2016
d5b4452
Removed a dependency already asked by corese modules.
ErwanDemairy Feb 9, 2016
086872d
Removed the calls to AttributeColumnsController. Addition of addAttri…
ErwanDemairy Feb 10, 2016
72d25e7
Replaced the e.printStackTrace() occurences by Exceptions.printStackT…
ErwanDemairy Feb 10, 2016
6fd59ed
Updated/cleaned the examples of configurations. Corrected a lack of n…
ErwanDemairy Feb 11, 2016
a467e24
Updating the corese version used to 3.2.0.
ErwanDemairy Feb 11, 2016
7e439b6
Setting Corese version used as 3.2.1 (remove the jws internal depende…
ErwanDemairy Feb 12, 2016
da76075
Removed any reference to the SOAP client until it is properly fixed.
ErwanDemairy Feb 23, 2016
2a46c73
Merge branch 'master' of github.com:gephi/gephi-plugins
ErwanDemairy Feb 23, 2016
1f6dad4
Updating to gephi 0.9.1.
ErwanDemairy Feb 23, 2016
7468927
Merge remote-tracking branch 'gephi-plugins/master'
ErwanDemairy Oct 25, 2022
9caa5b3
- update to gephi 0.9.7;
ErwanDemairy Jan 3, 2023
bd9c44a
Merge remote-tracking branch 'gephi-plugins/master'
ErwanDemairy Feb 13, 2023
c56ba50
- update to gephi 0.10.1;
ErwanDemairy Apr 4, 2023
e654b98
Changed http: to https:
ErwanDemairy Apr 4, 2023
2af1621
Update to corese 4.4.0.
ErwanDemairy Apr 4, 2023
89816fb
Merge remote-tracking branch 'gephi-plugins/master'
ErwanDemairy Apr 4, 2023
efa12b5
Add a changelog.
ErwanDemairy Apr 4, 2023
cce69e6
update of the gephi-plugin-parent dependency.
ErwanDemairy Apr 4, 2023
f079aea
ignore the log file.
ErwanDemairy Apr 4, 2023
ba7e2b8
Create codeql.yml
ErwanDemairy Apr 5, 2023
a56467c
Solves issue #1.
ErwanDemairy Apr 5, 2023
4a9262a
Cleaning suggested by code analyzer.
ErwanDemairy Apr 5, 2023
a49126e
Cleaning suggested by code analyzer.
ErwanDemairy Apr 6, 2023
5dc9b02
Configuration to use sonarqube.
ErwanDemairy Apr 6, 2023
ac8da8c
correction of the configuration to use sonarqube.
ErwanDemairy Apr 6, 2023
88aef6a
Added the lacking try-with-resources.
ErwanDemairy Apr 7, 2023
099400a
Cleaning.
ErwanDemairy Apr 7, 2023
291825d
Check if a diviser is not 0. More some cleaning.
ErwanDemairy Apr 7, 2023
967816d
Removed a nullable reference.
ErwanDemairy Apr 7, 2023
f4589f9
Cleaning nullpointerexception problems.
ErwanDemairy Apr 7, 2023
ee0276a
Prevent node from being null.
ErwanDemairy Apr 7, 2023
d64a5ed
Resolved some problems given by sonar.
ErwanDemairy Apr 7, 2023
3db5071
Coding style update.
ErwanDemairy Apr 19, 2023
f748896
Removed annotations considered useless.
ErwanDemairy Apr 20, 2023
00e5251
Removed redundant local variables.
ErwanDemairy Apr 20, 2023
1f44f50
Cleaning code.
ErwanDemairy Apr 20, 2023
8181588
Cleaning code.
ErwanDemairy Apr 20, 2023
6c84cab
[Cleaning] renamed classes to make them consistent with others.
ErwanDemairy Apr 20, 2023
1bff13e
Merge remote-tracking branch 'origin/develop'
ErwanDemairy Apr 20, 2023
e1e22cd
Close #2.
ErwanDemairy Apr 20, 2023
17b2d66
Correction of the build for codeql.
ErwanDemairy Apr 25, 2023
9004211
update of maven plugin for new release.
ErwanDemairy Jun 15, 2023
5990f7d
Update of the readme to help knowing how to install the plugin.
ErwanDemairy Nov 16, 2023
b9abbb1
Update README-plugin.md
ErwanDemairy Nov 20, 2023
b269f45
Update README-plugin.md
ErwanDemairy Nov 20, 2023
a67379d
Build ok for gephi 0.10.0
ErwanDemairy Nov 7, 2024
c8dfc0a
Upgrade to corese-core and corese-gui 4.5.0.
ErwanDemairy Nov 12, 2024
639ee65
update to python 3 syntax.
ErwanDemairy Nov 12, 2024
43140c8
get rid of non-used deprecated observer/observable usage.
ErwanDemairy Nov 12, 2024
fd25956
added some template parameters to reduce the number of warnings.
ErwanDemairy Nov 12, 2024
9c0c2bf
Merge pull request #7 from Wimmics/feature/gephi_10_1
ErwanDemairy Nov 14, 2024
296bcfc
Update README-plugin.md
ErwanDemairy Nov 18, 2024
Added the lacking try-with-resources.
Signed-off-by: Erwan Demairy <Erwan.Demairy@inria.fr>
ErwanDemairy committed Apr 7, 2023
commit 88aef6ac99eb3a53a3fd28f7fce42b147cdb740d
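The pattern this commit applies can be sketched in isolation. The class and method names below (`SaveResultSketch`, `save`) are hypothetical, not part of the plugin; the point is that a manually closed stream leaks if `write` throws, while a try-with-resources block closes it on every path.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Minimal sketch of the try-with-resources fix (hypothetical names).
public class SaveResultSketch {

    // Before the fix, code like "out.close()" after "out.write(...)" was
    // skipped whenever write() threw. Listing the stream as a resource
    // guarantees close() runs on both the normal and the exceptional path.
    static String save(String result, OutputStream out) throws IOException {
        try (out) { // Java 9+: an effectively final variable can be a resource
            out.write(result.getBytes());
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        save("ok", sink);
        System.out.println(sink); // prints "ok"
    }
}
```

For pre-Java-9 code (as in the diff below, which targets older syntax in places), the resource is declared inside the parentheses instead: `try (FileOutputStream fSave = new FileOutputStream(name)) { ... }`.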


@@ -36,58 +36,58 @@ public class RdfAnalyzer implements LongTask, Runnable {
     private ProgressTicket progressTicket;
     private int fynLevel;

     /**
      * Constructor.
      *
      * @param newModel Model to fill.
      * @param newSparqlRequest SPARQL request to fill with the model.
      */
     public RdfAnalyzer(final GraphModel newModel, final String newSparqlRequest, final int fynLevel) {
         super();
         this.model = newModel;
         this.sparqlRequest = newSparqlRequest;
         this.fynLevel = fynLevel;
         postProcessor = new EmptyPostProcessor();
     }

     @Override
     public final void run() {
         logger.info("Begin: Building the implementation relationships graph.");
         int waitSeconds = 5;
         Progress.start(progressTicket, waitSeconds);

         try {
             Progress.progress(progressTicket);
             sparqlRequestResult = driver.sparqlQuery(sparqlRequest);

             InputStream rdf = new ByteArrayInputStream(getSparqlRequestResult().getBytes(Charset.forName("UTF-8")));
             RdfParser parser = new RdfParser(rdf, model, fynLevel);

             Progress.progress(progressTicket);
             parser.parse();
             logger.log(Level.INFO, "Number of triples parsed: {0}", parser.getTripleNumber());
         } catch (Exception e) {
             logger.log(Level.INFO, "error when obtaining the nodes and edges: {0}", e.getMessage());
             Exceptions.printStackTrace(e);
         }

         Progress.progress(progressTicket);
         try {
             saveResult(getSparqlRequestResult());
         } catch (Exception e) {
             logger.log(Level.INFO, "error when saving the result: {0}", e.getMessage());
         }
         Progress.progress(progressTicket);
         postProcessor.setModel(model);
         postProcessor.run();

         logger.info("End: Building the implementation relationships graph.");
         Progress.finish(progressTicket);
     }

     public final void setPostProcessing(final LayoutExamplePostProcessor newPostProcessor) {
         this.postProcessor = newPostProcessor;
     }

public final void setSparqlEngine(final SparqlDriver newDriver) {
this.driver = newDriver;
@@ -101,30 +101,30 @@ public void setSaveResult(String saveResultName) {
this.saveResultName = saveResultName;
}

-    private void saveResult(String sparqlRequestResult) throws FileNotFoundException, IOException {
-        if (this.saveResultName.isEmpty()) {
-            return;
-        }
-        FileOutputStream fSave = new FileOutputStream(saveResultName);
-        fSave.write(sparqlRequestResult.getBytes());
-        fSave.close();
-    }
+    private void saveResult(String sparqlRequestResult) throws IOException {
+        if (this.saveResultName.isEmpty()) {
+            return;
+        }
+        try (var fSave = new FileOutputStream(saveResultName)) {
+            fSave.write(sparqlRequestResult.getBytes());
+        }
+    }

     /**
      * @return the sparqlRequestResult
      */
     public String getSparqlRequestResult() {
         return sparqlRequestResult;
     }

     @Override
     public boolean cancel() {
         return true;
     }

     @Override
     public void setProgressTicket(ProgressTicket pt) {
         this.progressTicket = pt;
     }

}
@@ -200,20 +200,21 @@ static protected ArrayList<String> listFilesInJar(String jarName, String path) t
         path = path.substring(1);
     }
     ArrayList<String> result = new ArrayList<String>();
-    JarFile jar = new JarFile(jarName);
-    Enumeration<JarEntry> entries = jar.entries();
-    while (entries.hasMoreElements()) {
-        String name = entries.nextElement().getName();
-        logger.log(Level.INFO, "{0}", name);
-        if (name.startsWith(path)) { //filter according to the path
-            String entry = name.substring(path.length());
-            int checkSubdir = entry.indexOf("/");
-            if (checkSubdir >= 0) {
-                // if it is a subdirectory, we just return the directory name
-                entry = entry.substring(0, checkSubdir);
-            }
-            if (!entry.isEmpty()) {
-                result.add(entry);
+    try (JarFile jar = new JarFile(jarName)) {
+        var entries = jar.entries();
+        while (entries.hasMoreElements()) {
+            var name = entries.nextElement().getName();
+            logger.log(Level.INFO, "{0}", name);
+            if (name.startsWith(path)) { //filter according to the path
+                var entry = name.substring(path.length());
+                int checkSubdir = entry.indexOf("/");
+                if (checkSubdir >= 0) {
+                    // if it is a subdirectory, we just return the directory name
+                    entry = entry.substring(0, checkSubdir);
+                }
+                if (!entry.isEmpty()) {
+                    result.add(entry);
+                }
+            }
+        }
     }
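The hunk above wraps a `JarFile` in try-with-resources so the underlying file handle is released even when iteration throws. A self-contained sketch of the same technique (names such as `JarListSketch` and `topLevelEntries` are hypothetical, and the sketch builds its own temporary jar so it can run standalone):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

// Sketch: list top-level names under a path inside a jar,
// with the JarFile managed by try-with-resources.
public class JarListSketch {

    static List<String> topLevelEntries(String jarName, String path) throws IOException {
        List<String> result = new ArrayList<>();
        try (JarFile jar = new JarFile(jarName)) { // closed even on exception
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                String name = entries.nextElement().getName();
                if (name.startsWith(path)) {
                    String entry = name.substring(path.length());
                    int slash = entry.indexOf('/');
                    if (slash >= 0) {
                        entry = entry.substring(0, slash); // keep only the directory name
                    }
                    if (!entry.isEmpty() && !result.contains(entry)) {
                        result.add(entry);
                    }
                }
            }
        }
        return result;
    }

    public static void main(String[] args) throws IOException {
        // Build a throwaway jar with two entries, then list under "cfg/".
        File jar = File.createTempFile("sketch", ".jar");
        jar.deleteOnExit();
        try (JarOutputStream out = new JarOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new JarEntry("cfg/a.txt"));
            out.closeEntry();
            out.putNextEntry(new JarEntry("cfg/sub/b.txt"));
            out.closeEntry();
        }
        System.out.println(topLevelEntries(jar.getAbsolutePath(), "cfg/")); // [a.txt, sub]
    }
}
```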
@@ -63,104 +63,103 @@ public final void setRdfRequest(String nodeRdfRequest) {
         }
     }

     public SemanticWebImportParser() {
         this(new RequestParameters(""));
     }

     public SemanticWebImportParser(RequestParameters requests) {
         this.parameters = requests;
     }

     /**
      * Asynchronous operation that fills the current workspace.
      *
-     * @sa waitEndpopulateRDFGraph
-     *
      * @param driverUsed SPARQL request driver.
      * @param properties Parameters to use.
+     * @sa waitEndpopulateRDFGraph
      */
     public final void populateRDFGraph(SparqlDriver driverUsed, Properties properties, LongTaskListener listener) {
         this.listener = listener;
         boolean resetWorkspace = Boolean.parseBoolean(properties.getProperty(PluginProperties.RESET_WORKSPACE.getValue(), "false"));
         boolean postProcessing = Boolean.parseBoolean(properties.getProperty(PluginProperties.POST_PROCESSING.getValue(), "false"));
         String saveResultName = properties.getProperty(PluginProperties.SAVE_SPARQL_RESULT.getValue(), "");
         int fynLevel = Integer.parseInt(properties.getProperty(PluginProperties.FYN_LEVEL.getValue(), "0"));

         logger.log(Level.INFO, "resetWorkspace = {0}", resetWorkspace);
         logger.log(Level.INFO, "postProcessing = {0}", postProcessing);

         this.driver = driverUsed;
         this.driver.setPluginProperties(properties);

         GephiUtils.createProjectIfEmpty();
         GephiUtils.createWorkspaceIfEmpty();

         pc = Lookup.getDefault().lookup(ProjectController.class);
         Workspace dataWorkspace = pc.getCurrentWorkspace();
         GraphModel model = getCurrentGraphModel(dataWorkspace);
         // @TODO how to reset the graph
         // if (resetWorkspace) {
         //     model.getNodeTable().clear();
         // }
         Table nodeTable = model.getNodeTable();
         if (!nodeTable.hasColumn("namespace")) {
             nodeTable.addColumn("namespace", String.class);
         }

         analyzer = initAnalyzer(model, parameters.getRdfRequest(), fynLevel);
         analyzer.setSaveResult(saveResultName);
         if (postProcessing) {
             LayoutExamplePostProcessor postProcessor = new LayoutExamplePostProcessor();
             analyzer.setPostProcessing(postProcessor);
         }
         executor.setLongTaskListener(this);
         executor.execute(analyzer, analyzer, "Importing Semantic Graph", new LongTaskErrorHandler() {
             @Override
             public void fatalError(Throwable thrwbl) {
                 thrwbl.printStackTrace();
             }
         });

     }

     @Override
     public void taskFinished(LongTask lt) {
         if (waitEndPopulate.availablePermits() < 1) {
             waitEndPopulate.release();
         }
         listener.taskFinished(lt);
     }

     public final void waitEndpopulateRDFGraph() throws InterruptedException {
         waitEndPopulate.acquire();
     }

     private RdfAnalyzer initAnalyzer(GraphModel model, String rdfRequest, int fynLevel) {
         driver.init();
         RdfAnalyzer localAnalyzer = new RdfAnalyzer(model, rdfRequest, fynLevel);
         localAnalyzer.setSparqlEngine(driver);
         return localAnalyzer;
     }

     public RequestParameters getParameters() {
         return parameters;
     }

     public void setParameters(RequestParameters newParameters) {
         this.parameters = newParameters;
     }

     private GraphModel getCurrentGraphModel(final Workspace currentWorkspace) {
         final GraphController currentGraphController = Lookup.getDefault().lookup(GraphController.class);
         return currentGraphController.getGraphModel(currentWorkspace);
     }

     /**
      * @return the rdfGraph
      */
     public Graph getRdfGraph() {
         return this.rdfGraph;
     }

public String getLastRdfResult() {
return analyzer.getSparqlRequestResult();
@@ -7,6 +7,7 @@
 package fr.inria.edelweiss.semantic.importer;

 import fr.inria.edelweiss.sparql.corese.CoreseDriver;
+
 import java.util.List;

/**
@@ -95,17 +95,17 @@ public static float convertFloat(final String id) {
         return decodedNum;
     }

     // Get values of color size RGB (0->1)
     public static float convertFloatColor(final String id) {
         float decodedNum = 0, temp;
         try {
             temp = Float.parseFloat(id.replaceAll("\"", "\\\""));
             decodedNum = (temp % 256) / 255;
         } catch (NumberFormatException ex) {
             Exceptions.printStackTrace(ex);
         }
         return decodedNum;
     }

//Split color values RGB (0,0,0)
public static String[] stringSplit(final String id) {
@@ -154,15 +154,15 @@ private void load_resource(final String fileName) throws LoadException {
      */
     private void load_resource_workaround(final String fileName) throws IOException, LoadException {
         int dotPos = fileName.lastIndexOf('.');
-        File tempFile = File.createTempFile("corese_input", '.' + fileName.substring(dotPos + 1, fileName.length()));
-        FileWriter outputTempFile = new FileWriter(tempFile);
-        final BufferedReader resource = new BufferedReader(new InputStreamReader(this.getClass().getResourceAsStream(fileName)));
-        String currentLine;
-        while ((currentLine = resource.readLine()) != null) {
-            outputTempFile.write(currentLine + '\n');
-        }
-        resource.close();
-        outputTempFile.close();
+        var tempFile = File.createTempFile("corese_input", '.' + fileName.substring(dotPos + 1, fileName.length()));
+        try (FileWriter outputTempFile = new FileWriter(tempFile);
+             var resource = new BufferedReader(new InputStreamReader(this.getClass().getResourceAsStream(fileName)))
+        ) {
+            String currentLine;
+            while ((currentLine = resource.readLine()) != null) {
+                outputTempFile.write(currentLine + '\n');
+            }
+        }
         loader.parse(tempFile.getAbsolutePath());
     }


@@ -59,14 +59,15 @@ public void setType(String newType) {

     @Override
     public Graph filter(Graph graph) {
-        Formatter getSubtypesOfrequest = new Formatter().format(GET_ALLTYPES_OF, type);
-        String[][] nodesToKeepList = SemanticWebImportMainWindowTopComponent.getSparqlRequester().selectOnGraph(getSubtypesOfrequest.toString());
-        HashSet<String> nodesToKeepSet = convertStringArrayToSet(nodesToKeepList);
+        try (Formatter getSubtypesOfrequest = new Formatter().format(GET_ALLTYPES_OF, type)) {
+            String[][] nodesToKeepList = SemanticWebImportMainWindowTopComponent.getSparqlRequester().selectOnGraph(getSubtypesOfrequest.toString());
+            HashSet<String> nodesToKeepSet = convertStringArrayToSet(nodesToKeepList);

-        Node[] nodes = graph.getNodes().toArray();
-        for (Node node : nodes) {
-            if (!nodesToKeepSet.contains((String)node.getId())) {
-                graph.removeNode(node);
-            }
-        }
+            Node[] nodes = graph.getNodes().toArray();
+            for (Node node : nodes) {
+                if (!nodesToKeepSet.contains((String) node.getId())) {
+                    graph.removeNode(node);
+                }
+            }
+        }
         return graph;
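The hunk above treats `java.util.Formatter` as a resource: it implements `Closeable`, so static analyzers flag it when it is never closed, even though closing a formatter over an internal `StringBuilder` mostly just marks it unusable. A minimal sketch of the pattern (class and method names are hypothetical, and the query template is an illustrative SPARQL fragment, not the plugin's actual `GET_ALLTYPES_OF` constant):

```java
import java.util.Formatter;

// Sketch: build a query string with a Formatter that is
// closed by try-with-resources once the result is extracted.
public class FormatterSketch {

    static String buildQuery(String template, String type) {
        try (Formatter f = new Formatter()) {
            // toString() must run before close(): a closed Formatter
            // throws FormatterClosedException on further use.
            return f.format(template, type).toString();
        }
    }

    public static void main(String[] args) {
        System.out.println(buildQuery("SELECT ?x WHERE { ?x a <%s> }",
                "http://example.org/Person"));
    }
}
```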
@@ -9,6 +9,7 @@
 import java.awt.event.ActionEvent;
 import java.awt.event.ActionListener;

+
 /**
  *
  * @author Erwan Demairy <Erwan.Demairy@inria.fr>
@@ -51,22 +52,22 @@ private void initComponents() {
         javax.swing.GroupLayout layout = new javax.swing.GroupLayout(this);
         this.setLayout(layout);
         layout.setHorizontalGroup(
             layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
                 .addGroup(layout.createSequentialGroup()
                     .addContainerGap()
                     .addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
                         .addComponent(sparqlRequestEditor, javax.swing.GroupLayout.DEFAULT_SIZE, 431, Short.MAX_VALUE)
                         .addComponent(updateButton, javax.swing.GroupLayout.Alignment.TRAILING, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE))
                     .addContainerGap())
         );
         layout.setVerticalGroup(
             layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING)
                 .addGroup(layout.createSequentialGroup()
                     .addContainerGap()
                     .addComponent(sparqlRequestEditor, javax.swing.GroupLayout.PREFERRED_SIZE, 128, javax.swing.GroupLayout.PREFERRED_SIZE)
                     .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED)
                     .addComponent(updateButton)
                     .addContainerGap(javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE))
         );
     }// </editor-fold>//GEN-END:initComponents

@@ -66,18 +66,18 @@ public void setType(String newType) {

     @Override
     public Graph filter(Graph graph) {
-        Formatter getSubtypesOfrequest = new Formatter().format(GET_SUBTYPES_OF, type);
+        try (Formatter getSubtypesOfrequest = new Formatter().format(GET_SUBTYPES_OF, type)) {

-        String[][] nodesToKeepList = SemanticWebImportMainWindowTopComponent.getSparqlRequester().selectOnGraph(getSubtypesOfrequest.toString());
-        HashSet<String> nodesToKeepSet = convertStringArrayToSet(nodesToKeepList);
+            String[][] nodesToKeepList = SemanticWebImportMainWindowTopComponent.getSparqlRequester().selectOnGraph(getSubtypesOfrequest.toString());
+            HashSet<String> nodesToKeepSet = convertStringArrayToSet(nodesToKeepList);

-        Node[] nodes = graph.getNodes().toArray();
-        for (Node node : nodes) {
-            if (!nodesToKeepSet.contains((String)node.getId())) {
-                graph.removeNode(node);
-            }
-        }
+            Node[] nodes = graph.getNodes().toArray();
+            for (Node node : nodes) {
+                if (!nodesToKeepSet.contains((String) node.getId())) {
+                    graph.removeNode(node);
+                }
+            }
+        }

         return graph;
     }